Week 5/40 — First “AI-like” Endpoint

Tech stack (this week)

  • Frontend: React + TypeScript

  • Backend: FastAPI

  • “AI-like”: simple structured analysis response (no heavy model)

New topic: designing an AI feature as an API contract (not “just prompts”).

Why this week?

The goal was to learn the integration pattern: define a request/response model, add an endpoint, call it from the UI, render results.

What I shipped

An Analyze button per entry:

  • Backend accepts text

  • Returns a structured response (summary, tags, sentiment)

  • Frontend displays analysis under the entry

The One Feature Rule

One feature: a single analysis endpoint + UI wiring.
Not doing: accuracy, fancy models, caching.

What broke / surprised me

Most of the pain was environment and process management (ports, reloading processes, and what “restart” actually means in Codespaces).

Follow along (code it yourself): This week’s task

Task: Add one analysis endpoint that returns structured results (no ML required).

  1. Define a request schema: { text: string }.

  2. Define a response schema like:

    • summary: string

    • tags: string[]

    • sentiment: "positive" | "neutral" | "negative"

    • reasons: string[]

  3. Implement POST /analyze. Use simple rules (keyword checks) to fill the response.

  4. In the UI, add an “Analyze” button and render the response below the entry.

Success criteria: Clicking “Analyze” shows structured output, and it includes “reasons”.

Extra help & hints

  • Teacher trick: treat “AI” as a function with a contract.

  • If you get stale results: you likely restarted the wrong server (e.g., the frontend dev server instead of the FastAPI backend).

  • Keep rules obvious: e.g., tags based on keyword presence.
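“Tags based on keyword presence” can be as simple as a dict of tag → keywords. The table below is hypothetical; the real rules are whatever fits your entries.

```python
# Hypothetical keyword→tag table for rule-based tagging.
TAG_KEYWORDS = {
    "work": {"meeting", "deadline", "standup"},
    "health": {"run", "sleep", "gym"},
    "learning": {"fastapi", "react", "typescript"},
}

def tags_for(text: str) -> list[str]:
    """Return every tag whose keyword set intersects the entry's words."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return sorted(tag for tag, kws in TAG_KEYWORDS.items() if words & kws)
```

Obvious rules like this make the feature debuggable: when a tag looks wrong, you can point at the exact keyword that triggered it, which is also where the `reasons` field comes from.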

Previous: Week 6/40 — Making AI Features Feel Real (Caching + Derived Data)

Next: Week 4_b/40 — Adding Persistence with SQLite