Why Deployment Matters in LLM Projects

Editor’s note: Sudip Shrestha is a speaker for ODSC East this May 13th-15th in Boston! Check out his talk, “Building and Deploying LLM Applications: Jupyter to Production in Streamlit and Beyond,” there!

When building an application powered by a Large Language Model (LLM), there are several moving parts that take it from an idea to something usable: writing prompt logic, handling user inputs, structuring backend code, managing API keys, and—most importantly—deploying it so others can use it.

It’s not just about getting the model to respond in a Jupyter notebook. It’s about turning that response into something a stakeholder can see, use, and provide feedback on. And for that, deployment is just as critical as development—if not more.


Jupyter Is Where We Start, Not Where We Stop

Jupyter Notebook is a natural starting point for building with LLMs. It’s great for quick experiments, writing prompt templates, and iterating on early logic. For example, when developing a Q&A chatbot or a summarizer, I often start with a notebook to test model behavior and fine-tune the prompts.
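The notebook stage is mostly about iterating on the prompt itself. Here is a minimal sketch of what that prototype might look like; the model call is stubbed out so the snippet runs without an API key, and the function names are illustrative rather than taken from the workshop.

```python
# Notebook-style prototype: iterate on the prompt template before
# worrying about any UI. call_llm is a stub standing in for a real
# provider call (e.g. an OpenAI chat completion).

PROMPT_TEMPLATE = (
    "You are a helpful assistant. Answer the question using only the "
    "context below. If the answer is not in the context, say so.\n\n"
    "Context:\n{context}\n\nQuestion: {question}\nAnswer:"
)

def build_prompt(context: str, question: str) -> str:
    """Fill the template -- the part you tweak cell by cell."""
    return PROMPT_TEMPLATE.format(context=context, question=question)

def call_llm(prompt: str) -> str:
    """Stub for the real API call; swap in your provider's client."""
    return f"[model response to {len(prompt)} chars of prompt]"

def answer_question(context: str, question: str) -> str:
    return call_llm(build_prompt(context, question))

print(answer_question("Streamlit turns Python scripts into web apps.",
                      "What does Streamlit do?"))
```

Keeping the prompt logic in plain functions like this pays off later: the same code can be imported unchanged by a Streamlit or FastAPI front end.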

Image: Screenshot of a portion of Q&A app in Jupyter Notebook

But here’s the thing: Jupyter is built for development, not delivery.

You can’t expect a manager or a business user to open a .ipynb file, install packages, run cells in order, and interpret raw model outputs. If the value of your LLM app lies in who uses it and how—it needs to be available through a real interface.

The Easiest Path to Sharing Your LLM App – Streamlit

This is where Streamlit comes in. If you’re a Python user looking to move fast from prototype to shareable app, Streamlit is your best friend.

With just a few lines of Python, you can create an interactive web UI—text boxes, buttons, display panels—and plug it directly into your LLM logic. There’s no need to learn front-end frameworks or write HTML/CSS. That makes Streamlit ideal for internal demos, stakeholder reviews, and even lightweight production tools.

Image: Screenshot of a portion of Q&A app using Streamlit

For example, imagine you’ve built a question-answering tool using OpenAI’s API. With Streamlit, you can wrap it in a simple interface where users enter a question, hit “Ask,” and get an instant response based on the input—all in under 100 lines of code.

Even better, Streamlit supports easy deployment through its own hosting service (Streamlit Community Cloud).

Streamlit sits in a sweet spot:

  • Easy to learn
  • Fast to build
  • Good enough for MVPs and demos

It’s where you get your ideas in front of people—quickly.

Going Further: FastAPI, Docker, and Production-Grade Stacks

Of course, Streamlit isn’t the only option—and it’s not always the right one, especially for more scalable or backend-heavy use cases.

If you’re building something that needs to integrate with other systems, expose endpoints, or handle more traffic, you’ll want to look at tools like:

FastAPI: Perfect for building REST APIs around your LLM logic. It’s fast (async-friendly), easy to write, and supports essentials like input validation and auto-generated documentation out of the box.

Docker: Great for packaging your app and all its dependencies into a portable container. You build once, and run it anywhere—on the cloud, on-prem, or locally.
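For a Streamlit app, the Dockerfile stays short. This is an illustrative sketch; the filenames and Python version are assumptions.

```dockerfile
# Illustrative Dockerfile for a Streamlit app (filenames are assumptions)
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8501
CMD ["streamlit", "run", "streamlit_app.py", "--server.address=0.0.0.0"]
```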

CI/CD Pipelines: Tools like GitHub Actions or GitLab CI help automate testing and deployment whenever you update your code.
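A basic GitHub Actions workflow for such a project might look like this sketch; job names and steps are illustrative and assume a `requirements.txt` and pytest suite exist.

```yaml
# .github/workflows/ci.yml -- illustrative; adapt names and steps
name: CI
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt
      - run: pytest
```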

These tools help you move beyond “cool demo” to “maintainable service.”

That said, not every project needs this level of complexity. The goal is to choose the right tool for the stage you’re in. Start simple, but design with growth in mind.


GitHub Is Your Project Backbone

One thing that’s essential no matter what stack you choose? GitHub.

GitHub is more than just a place to store code. It helps you:

  • Track version history
  • Collaborate with teammates
  • Roll back to stable versions
  • Trigger automated deployments via CI/CD
  • Keep your project organized and professional

Having a clean, well-structured GitHub repo is often the first step toward making your project shareable, open-source, or scalable. And it gives your future self—or your future team—a reliable base to build on.
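One common layout for a project like the one described here looks something like this; the names are illustrative, not prescribed.

```text
llm-qa-app/
├── notebooks/           # Jupyter prototypes
│   └── prototype.ipynb
├── app/
│   ├── streamlit_app.py
│   └── llm_logic.py     # prompt templates and model calls
├── tests/
├── requirements.txt
├── Dockerfile
└── README.md
```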

Deployment Is a Core Skill for AI Developers

As AI practitioners, we often spend our time fine-tuning prompts, evaluating outputs, or exploring new models. But too few of us invest time in productizing our work.

Building with LLMs is no longer just a research task—it’s becoming a core part of application development. And in the real world, the ability to ship your model matters just as much as how clever your prompt is.

Knowing how to take a working Jupyter notebook and turn it into a usable app is what converts an idea into value.

Learn This and More at ODSC East

These are the exact topics we’ll explore in my upcoming hands-on workshop at ODSC East in Boston, Building and Deploying LLM Applications: Jupyter to Production in Streamlit and Beyond.

In just 2 hours, we’ll go through:

  • Prototyping an LLM-powered app in Jupyter Notebook
  • Refactoring it in VS Code for structure
  • Using GitHub to manage the codebase
  • Deploying a working interface with Streamlit
  • Exploring alternative deployment stacks like FastAPI and Docker

If you’ve ever built something cool in a notebook but didn’t know how to share it—or you want to take your AI skills to the next level by learning how to deploy what you build—this workshop is for you.

About the Author/ODSC East Speaker:

Dr. Sudip Shrestha leads a data science initiative, finding joy in analyzing data to extract valuable insights. He combines technical skill with the creation of comprehensive business cases and employs a data-centric approach to problem-solving. Dr. Shrestha excels at converting business problems into practical solutions, with a focus on advanced analytics, AI, ML, and NLP. He has a deep passion for data science & AI and is committed to sharing his knowledge by contributing to the data science community.
