Diving (back) into Python - Streamlit
08/01/2025
As part of my ongoing dive back into Python and a parallel exploration of the tooling around AI, I've been building a few proof-of-concept applications and have needed to throw together a quick UI. I spent a little time looking at options that could save me time compared with something more heavyweight like React, and evaluated Streamlit, Plotly Dash and Shiny for Python. Streamlit felt like the best fit for what I was doing, because most of my applications were text based and didn't have many visualisations, which is where the other two excel.
About Streamlit
Streamlit is a Python-based framework that describes itself like this:
Streamlit is an open-source Python framework for data scientists and AI/ML engineers to deliver interactive data apps – in only a few lines of code.
I think it actually lives up to the billing! Technically, the way it operates is that you write a Python script that is run by the main Streamlit web server, and the Python statements in the script are translated into UI elements. The script re-runs every time the UI updates, which keeps application code very simple. There are built-in widgets for most of the things you'd expect - inputs and selects through to images, data tables and graphs - and it's well integrated with the Python data ecosystem, including libraries like Pandas.
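To make that concrete, here's a rough sketch of what a Streamlit script can look like (the widgets and names are purely illustrative): each statement renders a UI element, and the whole file re-runs whenever the user interacts with anything.

```python
# app.py - a minimal illustrative Streamlit script (run with: streamlit run app.py)
import streamlit as st

st.title("Hello Streamlit")

# Each widget call renders an element and returns its current value;
# the whole script re-runs on every interaction.
name = st.text_input("What's your name?", value="world")
excited = st.checkbox("Add an exclamation mark")

st.write(f"Hello, {name}{'!' if excited else '.'}")
```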
The good
Firstly, getting started is really quick and you get immediate feedback on how things are working, with live reloads as you modify the script. The integration with the Python data ecosystem is strong: good visuals from lots of different visualisation libraries, built-in support for Pandas dataframes, and a healthy ecosystem of plugins. I was fairly quickly able to get a sample LLM app up and running and integrated with one of the agentic AI frameworks.
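As a small illustration of the dataframe integration (the random-walk data below is made up purely for the example), a Pandas dataframe renders as an interactive table or a chart in a single call:

```python
import numpy as np
import pandas as pd
import streamlit as st

# A random walk stands in for real data
df = pd.DataFrame(
    np.random.randn(50, 2).cumsum(axis=0),
    columns=["series_a", "series_b"],
)

st.dataframe(df)   # interactive, sortable table
st.line_chart(df)  # quick chart straight from the dataframe
```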
The session state system is relatively straightforward and caching of results is possible. Because the script re-runs on every state change, both become essential fairly quickly, and they do add complexity, so you lose some of the initial simplicity.
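A rough sketch of both features (the slow load_data function below is hypothetical): st.session_state survives re-runs, and the st.cache_data decorator stops expensive work from repeating on every interaction.

```python
import time
import streamlit as st

# Cache an expensive call so it doesn't repeat on every re-run
@st.cache_data
def load_data(source: str) -> list[int]:
    time.sleep(2)  # stand-in for a slow query or API call
    return list(range(10))

# Session state survives re-runs, so we can count interactions
if "clicks" not in st.session_state:
    st.session_state.clicks = 0

if st.button("Click me"):
    st.session_state.clicks += 1

st.write(f"Clicked {st.session_state.clicks} times")
st.write(load_data("example"))
```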
Deployment
Getting the app up and running was fairly straightforward too. I deployed it as an Azure App Service for Containers via Bicep, used GitHub Actions to build the Docker container and push it to the registry, and then pulled the new version into the App Service via the Azure Web Apps Deploy action - just pass the image name in as a parameter, similar to [this example](https://github.com/Azure/actions-workflow-samples/blob/master/AppService/docker-webapp-container-on-azure.yml). I found this to be the best route after having issues getting the deployment working as a Python App Service, mostly relating to the time taken to pull the dependencies in the App Service. A Basic plan (B1) was sufficient, so the whole thing cost around the same overall as something like Static Web Apps. Note that App Service requires the server to run on port 8000.
Authentication
Authentication then came built in via App Service's EasyAuth, and getting the logged-in user's details is possible from Streamlit (although not formally supported) - see the snippet below:
import streamlit as st
import base64
import json
from streamlit.web.server.websocket_headers import _get_websocket_headers

headers = _get_websocket_headers()
auth = headers.get('X-Ms-Client-Principal')
name = headers.get('X-Ms-Client-Principal-Name')
claimsprincipal = json.loads(base64.b64decode(auth).decode('utf-8'))
The not so good
Styling is one of the main limitations of the toolkit. I wanted to add corporate branding to the frontend, but aside from the built-in theming system that lets you change colours and the like, it's hard to customise the look and feel. Following a few articles from around the web, I used st.markdown() to inject a stylesheet into the UI and customise the left navigation bar by adding content and overriding the built-in Streamlit styles. It's not very ergonomic (the original styling appears briefly before the stylesheet applies) and it's likely to be brittle if the CSS classes Streamlit uses change in the future. Nonetheless, it got the basics of corporate branding, such as logos, into the UI relatively quickly.
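For reference, the approach from those articles boils down to something like the snippet below. The selector and colour are purely illustrative, and Streamlit's internal markup isn't a stable API, which is exactly why this is brittle.

```python
import streamlit as st

# Inject a stylesheet over the top of Streamlit's own styles.
# The selector below is illustrative and may break across Streamlit versions.
CUSTOM_CSS = """
<style>
    [data-testid="stSidebar"] {
        background-color: #002855;  /* e.g. a corporate navy */
    }
</style>
"""
st.markdown(CUSTOM_CSS, unsafe_allow_html=True)
```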
Other limitations are that multi-user scenarios and authorisation aren't really supported, which limits the usefulness of the tool overall (to me, anyway).
Conclusion
So, would I use Streamlit again? Yes, probably, but for a quick internal tool used by engineers rather than end users. It's probably the fastest way to get to a UI that I've tried so far, but the limitations are frustrating, so it'll stay as a niche part of my toolbox - and I'll keep looking at alternatives and reaching for React and Svelte in the meantime.
Diving (back) into Python - Notebooks
21/09/2024
As I've continued to spend more time bringing myself up to speed with Python, I've found myself using notebooks more and more - but with a twist. I'm sure most readers are familiar with notebooks by now, but I don't recall them being around when I last used Python heavily - from memory, IPython existed, but Jupyter hadn't been invented.
Notebooks
For the uninitiated, notebooks were originally pioneered by Mathematica, but came to prominence through the Jupyter project (originally IPython Notebook); here's an early description from Linux Weekly News (a site I thoroughly recommend, by the way, and have been reading since I was at university, which is mumble years ago now).
They work by hosting code (in this case Python) in individual blocks called cells, which can be executed one at a time inside an interpreter that keeps running (the kernel). The code runs in a shared Python namespace, so anything a cell defines stays accessible for as long as the kernel is running in that session. Cells can contain Markdown as well as runnable code, and the output of each execution is written back into the notebook UI, so you can use it a bit like the REPL in Python (or F#, or Lisp, etc.). That output can be plain text (e.g. print statements), but also visualisations, arrays displayed as HTML tables, and so on.
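As a tiny illustration of that shared kernel state (the values are made up), something defined in one cell is still there when a later cell runs:

```python
# Cell 1 - runs in the kernel and leaves `total` in the shared namespace
prices = [3.5, 7.25, 1.0]
total = sum(prices)

# Cell 2 - run later, possibly much later; it can still see `total`,
# and the printed output appears directly beneath the cell
print(f"Total spend: {total:.2f}")
```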
This makes Jupyter notebooks into a literate programming environment, with all the affordances of the modern stack. Having the interim state saved and available for inspection can make interactive development a really pleasant experience - and you can document as you go in Markdown so you can express the why as well as the how the code works.
I first started using them in earnest with F#, as they support the bottom-up functional approach really well, and they've become where I start most projects in Python and other languages. They're great for exploratory programming when you're trying to get an idea up and running: build up a short snippet of code in a cell, check it produces the right output, promote it to a function, then build up the program as you go in the same way. In Python you also have access to libraries such as Matplotlib and Pillow, so you can see graphs or image output from your scripts live in the output cells as you go.
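A sketch of that workflow with made-up data: a quick throwaway cell to see the shape of things, then promote it into a function once it looks right.

```python
# Exploratory cell: quick and dirty, just to eyeball the data
import matplotlib.pyplot as plt

readings = [12, 15, 11, 19, 24, 22, 30]
plt.plot(readings)
plt.title("Raw readings")
plt.show()

# Later cell: promote the snippet to a function for the growing program
def plot_readings(readings, title="Raw readings"):
    fig, ax = plt.subplots()
    ax.plot(readings)
    ax.set_title(title)
    return fig
```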
The only real downside is that a notebook is just a JSON document, and because output can change between runs they don't play very well with Git - virtually any interaction changes the file, so commits can be dirty unless the output is cleared first.
Notebooks in VSCode
By default, the Jupyter environment is accessed via a browser, but my preferred route is Visual Studio Code (which is where most of my development happens outside of any C# I happen to be doing). There are extensions for Jupyter notebooks, and also Polyglot Notebooks for .NET languages such as F# (plus others). The .NET one supports SQL as a bonus and can even share data between languages - so, for example, you can execute a query and then plot the result in a JS library like ECharts. Another nice feature of Polyglot Notebooks is that you can load NuGet packages and DLL files from other projects via a magic command. The only real prerequisites are Python and Jupyter (to run the kernel) for the Jupyter extension; the Polyglot one requires .NET 8.
Here's VSCode running a Python and F# notebook - in this instance, Python was probably the better experience as Matplotlib did the thing I wanted out of the box, whereas I had to try several F# libraries before I got something simple that ran locally.
How I use it
As I mentioned earlier, I start a lot of my projects in a notebook, building up short snippets of code that help me flesh out the idea, then gradually making it more structured (for example, creating functions and modules in F#, or building up classes in Python), and using the Markdown cells to leave a trail of what I did and why, so that I can pick the experiment back up days or weeks later when I have some free time. Once I've proved the concept, I use the nbconvert utility (or the 'Export notebook to script' command in VSCode) to generate a Python or F# file and carry on as a normal project. This approach really works for me and has helped me stay productive in fits and starts as time allows.
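The same conversion can also be driven from Python rather than the command line; here's a minimal sketch (the filenames are hypothetical, and it assumes nbformat and nbconvert are installed):

```python
import nbformat
from nbconvert import ScriptExporter

# Roughly what `jupyter nbconvert --to script experiment.ipynb` does
notebook = nbformat.read("experiment.ipynb", as_version=4)
source, _resources = ScriptExporter().from_notebook_node(notebook)

with open("experiment.py", "w") as out:
    out.write(source)
```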
Bonus - Google Colab
The Google Colab service also uses Jupyter notebooks as its main interface, so I've been using it for some AI-related experiments. It's a great place to start: there's a generous free tier of GPU-enabled instances, and you get easy access to libraries such as Hugging Face Transformers, which makes experimenting with most open-source large language models very simple - more on this in a later post.
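In the meantime, as a flavour of what that looks like in a Colab cell (the model choice here is just an illustrative small model, not a recommendation):

```python
# Text generation with an open-source model via the Transformers pipeline API
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")
result = generator("Notebooks are great because", max_new_tokens=30)
print(result[0]["generated_text"])
```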