=== Backend API: ml-chatbot-api ===

==== Folder ====
<syntaxhighlight>
ml-chatbot-api/
  main.py
  requirements.txt
  Dockerfile
</syntaxhighlight>

===== main.py =====
<syntaxhighlight lang="python">
import os

from pydantic import BaseModel
from fastapi import FastAPI, HTTPException
from fastapi.middleware.cors import CORSMiddleware
from langchain_openai import ChatOpenAI
from langchain_core.messages import SystemMessage, HumanMessage

APP_NAME = "ml-chatbot-api"


class ChatRequest(BaseModel):
    message: str


class ChatResponse(BaseModel):
    reply: str


def get_openai_key() -> str:
    # ECS Secrets Manager injection will set the OPENAI_API_KEY env var at runtime
    key = os.getenv("OPENAI_API_KEY", "").strip()
    if not key:
        raise RuntimeError("OPENAI_API_KEY is missing (Secrets Manager injection likely not configured).")
    return key


app = FastAPI(title=APP_NAME)

# CORS: allow Streamlit UI origin(s). For simplicity in early testing, allow all.
# In production, set allow_origins=["https://YOUR_UI_DOMAIN"].
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=False,
    allow_methods=["*"],
    allow_headers=["*"],
)


@app.get("/api/health")
def health():
    return {"status": "ok"}


@app.post("/api/chat", response_model=ChatResponse)
def chat(req: ChatRequest):
    msg = (req.message or "").strip()
    if not msg:
        raise HTTPException(status_code=400, detail="message is required")

    try:
        api_key = get_openai_key()
        llm = ChatOpenAI(
            model="gpt-4o-mini",
            temperature=0.2,
            api_key=api_key,
        )
        messages = [
            SystemMessage(content="You are a helpful assistant. Answer clearly and concisely."),
            HumanMessage(content=msg),
        ]
        resp = llm.invoke(messages)
        return ChatResponse(reply=resp.content)
    except HTTPException:
        raise
    except Exception as e:
        raise HTTPException(status_code=500, detail=f"Chat failed: {str(e)}")
</syntaxhighlight>

===== requirements.txt =====
<syntaxhighlight lang="txt">
fastapi==0.115.6
uvicorn[standard]==0.30.6
pydantic==2.9.2
langchain==0.2.16
langchain-openai==0.1.22
</syntaxhighlight>

===== Dockerfile =====
<syntaxhighlight lang="dockerfile">
FROM python:3.11-slim

WORKDIR /app

RUN apt-get update && apt-get install -y --no-install-recommends curl \
    && rm -rf /var/lib/apt/lists/*

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY main.py .

EXPOSE 8000

CMD ["uvicorn", "main:app", "--host=0.0.0.0", "--port=8000"]
</syntaxhighlight>
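The comment in <code>get_openai_key()</code> assumes that ECS injects <code>OPENAI_API_KEY</code> from Secrets Manager when the task starts. A minimal sketch of that wiring with boto3 is below; the region, secret name, account ID, image URI, and execution role ARN are placeholders (not values from this project), and the task definition is trimmed to the fields relevant to secret injection.

<syntaxhighlight lang="python">
# Sketch: store the OpenAI key in Secrets Manager and reference it from an ECS
# task definition so the container receives it as the OPENAI_API_KEY env var.
# All names and ARNs below are placeholders.
import boto3

REGION = "us-east-1"  # placeholder region

secrets = boto3.client("secretsmanager", region_name=REGION)
created = secrets.create_secret(
    Name="ml-chatbot/openai-api-key",  # placeholder secret name
    SecretString="sk-REPLACE_ME",      # never commit a real key
)
secret_arn = created["ARN"]

ecs = boto3.client("ecs", region_name=REGION)
ecs.register_task_definition(
    family="ml-chatbot-api",
    requiresCompatibilities=["FARGATE"],
    networkMode="awsvpc",
    cpu="256",
    memory="512",
    # The execution role must allow secretsmanager:GetSecretValue on the secret.
    executionRoleArn="arn:aws:iam::123456789012:role/ecsTaskExecutionRole",  # placeholder
    containerDefinitions=[
        {
            "name": "api",
            "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/ml-chatbot-api:latest",  # placeholder
            "portMappings": [{"containerPort": 8000, "protocol": "tcp"}],
            # ECS resolves the secret at task start and exposes it as an env var,
            # which is what get_openai_key() in main.py reads.
            "secrets": [{"name": "OPENAI_API_KEY", "valueFrom": secret_arn}],
        }
    ],
)
</syntaxhighlight>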
=== Streamlit UI: ml-chatbot-ui ===

==== Folder ====
<syntaxhighlight>
ml-chatbot-ui/
  app.py
  requirements.txt
  Dockerfile
</syntaxhighlight>

===== app.py =====
<syntaxhighlight lang="python">
import os

import requests
import streamlit as st

st.set_page_config(page_title="ML Chatbot UI", layout="centered")
st.title("ML Chatbot (Streamlit UI)")

API_BASE_URL = os.getenv("API_BASE_URL", "").rstrip("/")
if not API_BASE_URL:
    st.error(
        "API_BASE_URL is not set. Set it to your API Gateway stage URL, "
        "e.g. https://XXXX.execute-api.REGION.amazonaws.com/prod"
    )
    st.stop()

CHAT_URL = f"{API_BASE_URL}/api/chat"
HEALTH_URL = f"{API_BASE_URL}/api/health"

# Basic health check
with st.expander("Backend status", expanded=False):
    try:
        r = requests.get(HEALTH_URL, timeout=5)
        st.write("Health:", r.status_code, r.text)
    except Exception as e:
        st.write("Health check failed:", str(e))

if "history" not in st.session_state:
    st.session_state.history = []  # list of (role, text)

prompt = st.chat_input("Ask something...")
if prompt:
    st.session_state.history.append(("user", prompt))
    try:
        resp = requests.post(CHAT_URL, json={"message": prompt}, timeout=60)
        if resp.status_code != 200:
            reply = f"API error {resp.status_code}: {resp.text}"
        else:
            reply = resp.json().get("reply", "")
            if not reply:
                reply = "Empty reply from API."
    except Exception as e:
        reply = f"Request failed: {str(e)}"
    st.session_state.history.append(("assistant", reply))

for role, text in st.session_state.history:
    with st.chat_message(role):
        st.write(text)
</syntaxhighlight>

===== requirements.txt =====
<syntaxhighlight lang="txt">
streamlit==1.39.0
requests==2.32.3
</syntaxhighlight>

===== Dockerfile =====
<syntaxhighlight lang="dockerfile">
FROM python:3.11-slim

WORKDIR /app

RUN apt-get update && apt-get install -y --no-install-recommends curl \
    && rm -rf /var/lib/apt/lists/*

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY app.py .

EXPOSE 8501

CMD ["streamlit", "run", "app.py", "--server.address=0.0.0.0", "--server.port=8501"]
</syntaxhighlight>
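Before pointing <code>API_BASE_URL</code> at an API Gateway stage, it can help to exercise the API container locally. The sketch below assumes the API image is already running and published on <code>localhost:8000</code> (the port exposed by the API Dockerfile) with <code>OPENAI_API_KEY</code> supplied to the container; the base URL and script are illustrative, not part of the project files above.

<syntaxhighlight lang="python">
# Sketch of a local smoke test for the API container. Assumes the container is
# reachable on localhost:8000; adjust BASE_URL for your environment.
import sys

import requests

BASE_URL = "http://localhost:8000"  # assumption: local container, port from the Dockerfile


def main() -> int:
    # 1. The health endpoint should return {"status": "ok"} with HTTP 200.
    health = requests.get(f"{BASE_URL}/api/health", timeout=5)
    print("health:", health.status_code, health.json())
    if health.status_code != 200:
        return 1

    # 2. The chat endpoint takes the same payload the Streamlit UI sends: {"message": ...}
    chat = requests.post(
        f"{BASE_URL}/api/chat",
        json={"message": "Say hello in one short sentence."},
        timeout=60,
    )
    print("chat:", chat.status_code, chat.json())
    return 0 if chat.status_code == 200 else 1


if __name__ == "__main__":
    sys.exit(main())
</syntaxhighlight>

These are the same two requests the UI issues from its "Backend status" expander and chat box, so a passing smoke test usually leaves only the <code>API_BASE_URL</code> wiring and the CORS <code>allow_origins</code> setting to verify.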