Should designers code?
Yes, next question.
A more interesting question is: what type of code should designers write?
After almost a decade as a designer, you might expect my primary tools to be visual: Figma, Miro, or even pencil and paper. Or, knowing my background as a researcher, you might guess my go-to tools are research platforms like Dovetail or Optimal Workshop. At times, you'd be right on both counts.
However, one tool people often overlook in the designer's toolkit is Python. I've dabbled in Python over the past ten years, and in this blog post, I'll show you a few ways I've used it to streamline my design workflow and tackle gnarly design problems.
Reasons I've Used Python Recently
Survey Analysis & Data Visualization
Integrating Python into my workflow has transformed how I make data-informed design decisions, particularly in survey analysis and data visualization. Python scales in a way spreadsheets don't, letting me process and analyze large survey datasets efficiently. With libraries such as Pandas, I can quickly clean up messy data and make sure my insights rest on accurate information. The natural language processing capabilities of libraries like Gensim, NLTK, and TextBlob have also opened up new avenues for working with unstructured data, helping me extract meaningful insights from open-text responses that are easy to overlook.
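To give a flavor of the cleanup step, here's a minimal sketch with Pandas. The column names and values are invented for the example; real survey exports are messier, but the pattern is the same:

```python
import pandas as pd

# Hypothetical raw survey export: inconsistent casing, stray
# whitespace, and blank rows are typical of real exports.
raw = pd.DataFrame({
    "role": ["  Designer", "researcher ", "DESIGNER", None],
    "satisfaction": ["4", "5", "3", None],
})

clean = (
    raw.dropna()  # drop incomplete responses
       .assign(role=lambda d: d["role"].str.strip().str.lower(),
               satisfaction=lambda d: d["satisfaction"].astype(int))
)

print(clean["role"].tolist())        # ['designer', 'researcher', 'designer']
print(clean["satisfaction"].mean())  # 4.0
```

A few chained lines replace what would be a column-by-column cleanup pass in a spreadsheet.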
WECMS – Skills Mapping
For instance, I've used Matplotlib to create a comprehensive visualization of my team's skillsets, identifying areas where we could improve our capabilities and highlighting individuals who could support their colleagues in specific domains.
In Google Sheets, combining data was a pain: every view I wanted required its own manual setup. With Python, I could write code that handled different configurations, so the same script could show how one person's skills fit their growth plan or visualize the skills of the whole team.
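Here's a sketch of what that reuse looks like with Matplotlib. The skill categories, levels, and the second name are invented for the example; the point is that one plotting function serves both the individual and the team view:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt
import numpy as np

# Invented sample data: self-assessed skill levels (0-5) per person.
skills = {
    "Sloan": {"Visual design": 5, "Research": 3, "Prototyping": 4},
    "Avery": {"Visual design": 2, "Research": 5, "Prototyping": 3},
}

def plot_skills(data, title, outfile):
    """One plotting function covers an individual or the whole team."""
    categories = sorted({c for levels in data.values() for c in levels})
    x = np.arange(len(categories))
    width = 0.8 / len(data)
    fig, ax = plt.subplots()
    for i, (name, levels) in enumerate(data.items()):
        ax.bar(x + i * width, [levels.get(c, 0) for c in categories],
               width, label=name)
    ax.set_xticks(x + width * (len(data) - 1) / 2)
    ax.set_xticklabels(categories)
    ax.set_ylabel("Skill level (0-5)")
    ax.set_title(title)
    ax.legend()
    fig.savefig(outfile)
    plt.close(fig)

# The same code renders one person's skills or the team view:
plot_skills({"Sloan": skills["Sloan"]}, "Sloan's skills", "sloan.png")
plot_skills(skills, "Team skills", "team.png")
```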
In this sample team, Sloan is our resident expert on visual design, so the team goes to her for feedback on whether a design aligns with our product’s visual identity.
DCLX - 2024 Event Survey Analysis
As Vice President of the DC Lindy Exchange, a non-profit focused on education about swing dancing and jazz music, I dissect our post-event feedback survey data every year. My mission? To uncover actionable insights that will elevate our future events. With Python, I can dig deep into the survey data quickly, using libraries like Pandas for data manipulation, Matplotlib for vivid visualizations, and NLTK for natural language processing.
Our routine survey analyses focus on creating data visualizations for multiple-choice or single-select survey questions.
For our 2024 event, overall ratings for each area were impressively high, but I didn't stop there. I applied sentiment analysis to get to the heart of our improvement areas, pinpointing nuanced negative feedback that might otherwise have slipped through the cracks.
Using Python, I created a sorted list of survey feedback with sentiment scores.
The result? A clear roadmap for targeted enhancements from 2024 to 2025 that promised to transform good events into unforgettable experiences. This project showcased my technical skills and ability to translate complex data into strategic design decisions – a crucial skill for any design leader looking to make a real impact.
Medicare – Website Feedback Survey Analysis
One of my most impactful projects over the past year has been redesigning key pages of Medicare.gov, focusing on the journey of people approaching Medicare eligibility. This was a critical initiative for CMS: every year, millions of new beneficiaries must get started with Medicare, often while rushed into health decisions that may not serve them well.

Using Python with Pandas and Matplotlib, I analyzed user feedback and behavior data from the existing website to identify pain points and areas for improvement. Applying natural language processing techniques, including sentiment analysis with TextBlob and NLTK, we categorized feedback by sentiment and spotted patterns in negative responses that a manual review would have missed. We also ran frequency analysis to determine which themes and issues appeared most often in the survey responses. This dual approach surfaced both the emotional tone and the prevalent topics in the user feedback.

These insights directly informed our redesign strategy, leading to a more intuitive, step-by-step guidance system that helps users understand their coverage options and enrollment timelines. The redesigned pages improved user satisfaction and increased successful completion rates for Medicare enrollment. This project exemplifies how Python-powered data analysis can directly inform design decisions, producing tangible improvements to critical government services.
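The frequency-analysis step can be sketched with nothing but the standard library. The responses and the tiny stopword list here are invented for the example; in practice I'd lean on NLTK's tokenizers and its full stopword corpus:

```python
import re
from collections import Counter

# Invented sample feedback standing in for real survey responses.
responses = [
    "The plan finder was confusing to navigate.",
    "I could not find the enrollment deadline anywhere.",
    "Enrollment steps were confusing and hard to find.",
]

# A tiny stopword list for the sketch; NLTK ships a full one.
stopwords = {"the", "was", "to", "i", "could", "not",
             "and", "were", "anywhere"}

words = Counter(
    word
    for text in responses
    for word in re.findall(r"[a-z]+", text.lower())
    if word not in stopwords
)

print(words.most_common(3))
```

Even on this toy sample, the counts point at "confusing", "enrollment", and "find" as recurring themes, which is the kind of signal that helps prioritize a redesign.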
Content Analysis using BeautifulSoup
While Python libraries like NLTK and Gensim are powerful tools for natural language processing and topic modeling, BeautifulSoup has become the workhorse of my content analysis workflow. This versatile library excels at web scraping, letting me quickly extract and manipulate web content. Combined with Requests for fetching pages and Pandas for organizing the results, BeautifulSoup becomes an indispensable tool for comprehensive web content analysis.
This is my dog, Soup. She is quite beautiful, but definitely not what we’re talking about here.
LifeBridge Health
For a 350-page website migration, I used BeautifulSoup to scrape and extract content and assets. Automating this step significantly streamlined content preparation for editorial revision, saving countless hours of manual work and ensuring the team reviewed and updated every page before migration.
When supporting LifeBridge Health with its website redesign and platform migration, I used BeautifulSoup to download content from all site pages and paste it into Google Docs for content teams to review, revise, and map content to available components. We also used it to create an asset library of images, PDFs, and locally hosted videos that would allow us to determine which assets (if any) could be deprecated.
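The core of that extraction looks something like the sketch below. The inline HTML is an invented stand-in for a fetched page; in the real workflow the markup came from `requests.get(url).text`:

```python
from bs4 import BeautifulSoup

# Invented sample page standing in for a fetched LifeBridge page.
html = """
<html><body>
  <main>
    <h1>Cardiology Services</h1>
    <p>Our team provides comprehensive heart care.</p>
    <img src="/images/heart-team.jpg" alt="Care team">
    <a href="/docs/visitor-guide.pdf">Visitor guide (PDF)</a>
  </main>
</body></html>
"""

soup = BeautifulSoup(html, "html.parser")

# Page text destined for the editorial review doc.
text = soup.get_text(separator="\n", strip=True)

# Assets for the library: images plus linked PDFs.
assets = [img["src"] for img in soup.find_all("img")]
assets += [a["href"] for a in soup.find_all("a")
           if a.get("href", "").endswith(".pdf")]

print(text)
print(assets)  # ['/images/heart-team.jpg', '/docs/visitor-guide.pdf']
```

Run across every URL in a sitemap, this turns days of copy-paste into a loop.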
Medicare Publications
During my time as design lead for the Medicare.gov static site, we identified numerous PDF publications scattered across their website as static content; I leveraged BeautifulSoup to gather a comprehensive list of these documents. The script I developed located these PDFs and extracted relevant metadata before transforming the resulting data into a structured spreadsheet, ready for import into Drupal as a custom 'publications' content type. This process turned a potential logistical nightmare into a smooth, automated workflow.
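A simplified sketch of that PDF-cataloging script is below. The HTML snippet, titles, and filenames are invented for the example, and the real script pulled richer metadata, but the shape of the pipeline is the same: find PDF links, collect metadata, and export a structured table:

```python
import pandas as pd
from bs4 import BeautifulSoup

# Invented HTML standing in for a crawled Medicare.gov page.
html = """
<ul>
  <li><a href="/pubs/10050-medicare-and-you.pdf">Medicare &amp; You</a></li>
  <li><a href="/pubs/11219-enrolling.pdf">Enrolling in Medicare</a></li>
  <li><a href="/about">About us</a></li>
</ul>
"""

soup = BeautifulSoup(html, "html.parser")
rows = [
    {"title": a.get_text(strip=True), "url": a["href"]}
    for a in soup.find_all("a")
    if a["href"].endswith(".pdf")
]

# A structured table, ready to hand off for a Drupal import.
publications = pd.DataFrame(rows)
publications.to_csv("publications.csv", index=False)
print(publications)
```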
[side-by-side screenshot of old Mgov page & resulting spreadsheet; put an arrow between the two]
CMS Navigation Audit
Most recently, I used BeautifulSoup to extract all navigation links from CMS.gov in a project focused on optimizing website navigation. We could measure each navigation item's performance by pairing this data with analytics on user clicks. This data-driven approach allowed us to make informed decisions about menu structure and link placement, ultimately enhancing user experience and site usability.
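Extracting just the navigation links comes down to scoping the search to the `<nav>` element. The markup and labels below are invented for the example:

```python
from bs4 import BeautifulSoup

# Invented markup standing in for CMS.gov's navigation.
html = """
<nav>
  <a href="/medicare">Medicare</a>
  <a href="/medicaid">Medicaid</a>
  <a href="/about">About CMS</a>
</nav>
<footer><a href="/privacy">Privacy</a></footer>
"""

soup = BeautifulSoup(html, "html.parser")

# Only links inside <nav> count as navigation items;
# the footer link is deliberately excluded.
nav_links = [(a.get_text(strip=True), a["href"])
             for a in soup.select("nav a")]

for label, href in nav_links:
    print(f"{label}: {href}")
```

Each (label, URL) pair can then be joined against click analytics to score every menu item.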
These projects highlight how BeautifulSoup can help us creatively solve various content-related challenges. Whether migrating large volumes of content, cataloging scattered resources, or analyzing site structure, BeautifulSoup provides the flexibility and power to efficiently handle complex content analysis tasks.
I've built comprehensive content analysis pipelines by pairing BeautifulSoup with other Python libraries, such as Pandas for data manipulation and Matplotlib for visualization. These pipelines save time and yield deeper, more actionable insights into web content structure and performance.
Leveraging Google PageSpeed Insights API for Performance Analysis
The Google PageSpeed Insights API is an invaluable tool in my quest to measure and improve website performance. By combining this API with Python's Requests library, I've created scripts that automate the collection of crucial performance metrics for desktop and mobile platforms.
One of my most impactful projects involved analyzing a large-scale website with over 350 pages. I developed a Python script that interfaced with the PageSpeed Insights API to gather performance, accessibility, and SEO scores for each page across desktop and mobile versions. The script automatically updated a spreadsheet with these scores, providing a comprehensive overview of the site's technical health.
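The per-page call looks roughly like this. The request-building and score-parsing below match the public v5 API, but treat it as a sketch: the sample response is trimmed down to the fields used here, and a real run would loop `fetch_report` over all 350 URLs (and would need network access, plus an API key for heavy use):

```python
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_report(url, strategy="mobile"):
    """Fetch a Lighthouse report for one page (live network call)."""
    params = {
        "url": url,
        "strategy": strategy,  # "mobile" or "desktop"
        "category": ["performance", "accessibility", "seo"],
    }
    return requests.get(API, params=params, timeout=60).json()

def extract_scores(report):
    """Pull 0-100 category scores out of the response JSON."""
    categories = report["lighthouseResult"]["categories"]
    return {name: round(data["score"] * 100)
            for name, data in categories.items()}

# Parsing demonstrated on a trimmed-down sample response,
# so this sketch runs without hitting the network:
sample = {"lighthouseResult": {"categories": {
    "performance": {"score": 0.83},
    "accessibility": {"score": 0.97},
    "seo": {"score": 1.0},
}}}
print(extract_scores(sample))  # {'performance': 83, 'accessibility': 97, 'seo': 100}
```

Writing each page's score dict into a spreadsheet row gives the site-wide overview described above.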
This automated approach saved countless hours of manual testing and offered immediate insights into areas needing improvement. By visualizing the data in a spreadsheet, we could quickly identify patterns and prioritize our optimization efforts. The project proved instrumental in guiding our technical discovery work, allowing us to focus on the most critical issues and track improvements over time. This data-driven method ensured that our performance optimization strategy was targeted and effective, leading to a better user experience across the entire website.
In another project, we used the results of Google Lighthouse tests to prioritize work during ongoing roadmapping. When we saw low accessibility and SEO scores, we prioritized improvements on those pages over others that were performing fine for now.
Bringing Ideas to Life: Prototyping with Python and Django
I've leveraged Python and Django to create fully functional prototypes and websites throughout my career, bridging the gap between design concepts and tangible user experiences. This approach has allowed me to rapidly iterate on ideas and demonstrate complex functionalities in a practical, interactive manner.
One of my favorite prototyping techniques involves using Django views to gather and display data. This method has proven invaluable in creating dynamic, data-driven prototypes that closely mimic real-world applications. For instance, I developed a text analytics tool demo using Django, incorporating sentiment analysis and frequency analysis capabilities. This prototype showcased the potential of natural language processing and provided a platform for storing and analyzing uploaded content.
My prototyping work often extends beyond standalone applications. In one project, I used Python to transform a Google Sheet into a JSON file, which I then uploaded to a public GitHub repository. With a publicly available JSON file, I populated an event schedule using JavaScript, demonstrating how different technologies can be combined to create efficient, scalable solutions.
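The Sheet-to-JSON step needs nothing beyond the standard library. A Google Sheet can be exported as CSV; the inline string below is an invented stand-in for that export:

```python
import csv
import io
import json

# Invented schedule rows standing in for a Google Sheet's CSV export.
sheet_csv = """day,time,event
Friday,20:00,Welcome dance
Saturday,10:00,Beginner class
Saturday,21:00,Main dance
"""

rows = list(csv.DictReader(io.StringIO(sheet_csv)))

# Write the JSON file that gets committed to the public GitHub repo,
# where client-side JavaScript fetches it to render the schedule.
with open("schedule.json", "w") as f:
    json.dump(rows, f, indent=2)

print(rows[0])  # {'day': 'Friday', 'time': '20:00', 'event': 'Welcome dance'}
```

Because the JSON lives at a stable public URL, the front end stays a plain static site while the content stays editable in a spreadsheet.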
The power of Python in my workflow isn't limited to prototypes. I've built several fully functional websites using Django and headless CMS platforms like Contentful. My design portfolio and the website for "West of MLK," a podcast about living in Southwest Baltimore, are prime examples of this approach. These projects showcase how prototyping skills can evolve into production-ready solutions.
I've also explored integrating various database tools into my prototypes and websites. Using platforms like Notion and Airtable, I've created dynamic applications such as a personalized reading list and a visitor guide for Baltimore. The visitor guide is particularly noteworthy: it displays recommendations from an Airtable base and supports user-generated content, letting visitors add their own recommendations and reviews.
Here’s a screenshot from my Baltimore visitor guide, showing one of its listings.
Through these projects, I've demonstrated how Python and Django can be powerful tools for service designers. They enable us to create interactive, data-driven prototypes and websites that bring our ideas to life and provide tangible value to users.
A brief service blueprint of how I worked with Lindy Focus to dynamically update their event schedule using a Google Sheet, two Python scripts, and some custom JavaScript in their Squarespace instance.
Getting Started with Python as a Designer
So, you're ready to dive into the world of Python? Awesome! Don't worry if you're feeling overwhelmed – we've all been there. The good news is that Python is known for its readability and simplicity, making it an excellent choice for designers venturing into coding.
Learning Resources
If you’re ready to get started but don’t know where to begin, I’ve got you covered. There are plenty of resources out there that can teach you how to code with Python:
- Codecademy's Python Course: Perfect for absolute beginners, this interactive course will guide you through the basics step-by-step.

Once you're comfortable with the basics, a few libraries are especially useful for designers:
- Pillow: for image processing
- Matplotlib: for creating charts and graphs
- BeautifulSoup: for web scraping (great for gathering inspiration!)