Cambrian College Capstone · Client: Swedish AI Co. · Flask / jQuery

Agile Delivery under Constraints:
Rapid Prototyping an AI Mentor

Midway through the project, the client pivoted from an analytics product to an AI Mentor. With only four weeks remaining and no access to school infrastructure, I architected a bypass solution to deliver a functional MVP on time.

My Role

Frontend Developer & Integration Owner

Led the technical feasibility analysis and API contract definition.

The Constraint

Strict deadline (4 weeks) + Zero administrative access to the target LMS platform.

The Strategy

Prioritized Time-to-Market by utilizing ephemeral infrastructure (Colab+Ngrok) to bypass DevOps overhead.

1 Feasibility & Trade-off Analysis

The client initially requested a deep plugin integration. I performed a "Spike" (technical investigation) and identified critical blockers. I presented three architecture options to the stakeholders:

| Architecture Option | Pros | Risks / Blockers | Verdict |
| --- | --- | --- | --- |
| A. MS Teams Bot | Native integration | High provisioning latency (>2 weeks); steep learning curve risking the deadline | Discarded |
| B. LMS Plugin (LTI) | Best user experience | **Blocker:** requires admin API keys (access denied by IT dept.) | Impossible |
| C. Decoupled Web App (MVP approach) | Full autonomy; rapid deployment; zero dependency on third-party permissions | Separate login required (accepted trade-off for speed) | **Selected** |

* Decision Rationale: We chose Option C to decouple our delivery timeline from the school's slow IT approval process.
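The decoupling worked because the interface between teams was tiny. Based on the frontend code later on this page, the contract amounted to a single GET endpoint returning one field (the response value below is illustrative, not an actual label from the model):

Request:

```
GET /api?nm=<url-encoded user message>
```

Response:

```json
{ "res": "detected_intent_label" }
```

Freezing this shape early is what let both teams build against a stub instead of each other.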

2 Architecture: Ephemeral Infrastructure

DevOps-Free Data Flow

flowchart LR
    U((User))
    subgraph Public["Client Side"]
        FE[Frontend App]
    end
    subgraph Tunnel["Secure Tunnel"]
        NG[Ngrok]
    end
    subgraph Cloud["Ephemeral Runtime"]
        FL[Flask API]
        TF[TensorFlow Model]
    end
    U -->|Type Message| FE
    FE -->|AJAX Request| NG
    NG -.->|Forward| FL
    FL -->|Inference| TF
    TF -->|Result| FL
    FL -->|JSON Response| FE
    style NG stroke:#3B82F6,stroke-width:2px,color:#fff
    style Cloud stroke:#262626,fill:#1A1A1A
    style Public stroke:#262626,fill:#1A1A1A

Why "Ephemeral" Environments?

Setting up a production-grade Cloud GPU environment typically takes days of configuration and approval. By leveraging Ephemeral Environments (Google Colab runtimes exposed via Ngrok), we achieved:

  • ZERO DEVOPS OVERHEAD

    Eliminated the need for Dockerization and server provisioning. The Data Scientists could deploy updates simply by re-running a notebook cell.

  • RAPID ITERATION LOOPS

    Enabled real-time integration testing. I could test my frontend against the latest model version instantly, without waiting for CI/CD pipelines.
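The whole backend fit in a single notebook cell. A minimal sketch, assuming a Colab runtime: `predict_intent()` is a stand-in for the real TensorFlow model, and the commented `pyngrok` lines show how the port would be tunneled. (Names here are illustrative, not the project's actual code.)

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

def predict_intent(message):
    # Placeholder for TensorFlow inference (~1.5 s in the real model).
    return "greeting" if "hello" in message.lower() else "fallback"

@app.route("/api")
def api():
    # The frontend sends the user's message as the "nm" query parameter.
    message = request.args.get("nm", "")
    return jsonify({"res": predict_intent(message)})

@app.after_request
def allow_cors(response):
    # CORS must be granted server-side so the decoupled frontend,
    # served from a different origin, can call this API.
    response.headers["Access-Control-Allow-Origin"] = "*"
    return response

if __name__ == "__main__":
    # In Colab, the pyngrok package can expose the local port publicly:
    #   from pyngrok import ngrok
    #   print(ngrok.connect(5000))  # public URL for the frontend config
    app.run(port=5000)
```

Re-running this cell after a model update redeployed the whole service, which is the entire DevOps story of the MVP.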

3 Frontend Implementation

I implemented the client using jQuery for rapid prototyping. The core challenge was managing the user's perception of latency (approx 1.5s inference time) to maintain a smooth conversational flow.

func.js Async Request Logic
function sendRequest() {
    // 1. Optimistic UI: show the typing indicator immediately
    showLoadingState(true);

    $.ajax({
        // Encode user input so special characters survive the query string
        url: "/api?nm=" + encodeURIComponent($("#user-input").val()),
        type: "GET",
        // Note: CORS is granted by the *response* headers set on the
        // Flask side; the client does not (and cannot) grant it itself.
        success: function(data) {
            console.log("Intent detected:", data.res);

            // 2. Simulate a natural typing delay before rendering
            setTimeout(function () {
                showLoadingState(false);
                robotResponse(data.res);
            }, 2000);
        },
        error: function() {
            showLoadingState(false);
            renderError("Service unavailable");
        }
    });
}

Handling Latency

The remote inference wasn't instant. I implemented a typing indicator and used `setTimeout` to bridge the gap between request and response, preventing the interface from feeling "frozen".

MVP Validation

Used lightweight client-side validation for Student IDs (the letter `A` followed by digits). This let us gate the demo without spending time building a full authentication backend.
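The rule itself is a one-line regular expression. In the project it ran client-side in jQuery; the same pattern is sketched here in Python, with the shape assumed from the "A + numbers" description (the MVP did not enforce a fixed digit count):

```python
import re

# Accept IDs shaped like "A" followed by one or more digits, e.g. A1234567.
STUDENT_ID = re.compile(r"^A\d+$")

def is_valid_student_id(candidate):
    # Trim stray whitespace before matching, mirroring typical form handling.
    return bool(STUDENT_ID.match(candidate.strip()))
```

Client-side checks like this are demo gating, not security; anything real would need server-side verification.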

Project Retrospective

Key Success Factors

  • Decoupled Architecture: Defining the JSON contract early meant Frontend and Data Science teams never blocked each other.
  • Bias for Action: Choosing Option C (Web App) over the stalled LMS integration saved the project from administrative limbo.

Evolution & Growth

  • Then: Focused on making it work now using jQuery and Tunnels.
  • Now: I would containerize the model (Docker) for stability and use React for better state management, but the strategic decision to use "Hackathon tech" for an MVP remains valid.