Tashfeen Ahmed - Portfolio
Senior Product Designer at Microsoft. Designing experiences for human-AI interfaces, writing assistance, and data visualization.
Hi, I'm Tashfeen Ahmed, Sr. Product Designer at Microsoft
I design experiences for human-AI interfaces, craft solutions for writing assistance, and use data visualization as a medium to present my findings.
Explore
- Get to know me
- My work
- Resume
- Reach out
Get to know me
Experienced design leader, creating transformative experiences for AI
I'm a Senior Product Designer at Microsoft, shaping the future of digital experiences with Microsoft 365 Copilot.
With a Master's in Design Informatics from the University of Edinburgh, I've honed my skills in understanding and applying the nuances of human-computer interaction and data science to meet and exceed user expectations. Before joining Microsoft, I contributed my expertise to MathWorks, focusing on core MATLAB functionality, and dived into the dynamics of the real estate market in the Middle East at Bayut.
Current Practice
At Microsoft, I'm at the intersection of AI and design, navigating the uncharted waters of designing for systems powered by transformative technologies like LLMs and other advanced AI models. This unique vantage point allows me to craft experiences that are not just functional but are also at the cutting edge of what's possible, balancing user needs with the potent capabilities of AI.
My design ethos is rooted in the belief that technology should empower, not overpower, enhancing human experiences without overshadowing them. It's about striking that perfect harmony between innovation and intuition, ensuring every digital interaction feels natural, meaningful, and distinctly human.
As I continue to explore this exciting confluence of AI and design, I am driven by the endless possibilities of creating solutions that are not just effective but are also ethical and inclusive, reflecting a deep understanding of the diverse tapestry of users we serve.
Experiences
Click an image to scramble it into a puzzle
Kind words
Click a card to shuffle • Drag to rearrange
My work
- Microsoft 365 Copilot
- User Evaluation
- KeepTruckin
- Makerarm
- Fidget Digits
- COVID in Pixels
- pyReality
- MIDI Controller
- Expressive Lamp
Project
Microsoft 365 Copilot
Designing human-AI interaction patterns for Microsoft 365 Copilot, helping millions of users create and comprehend content.
DESIGNING FOR HUMAN-AI INTERACTION
The new era of creativity with Microsoft Word
During a pivotal platform shift driven by AI, I was among the first designers to integrate transformer-based (GPT) language models into Microsoft's ecosystem, primarily Word, which is used by millions of people worldwide. My work on Copilot in Word helps users draft, summarize, rewrite, and even chat about documents, unlocking productivity and creativity. By designing new patterns and interactions, I redefined how people create and edit content, offering tailored suggestions and proactive text transformations, all within the familiar Word environment.
AGENTIC AI IN OFFICE
Agent Mode: True AI intelligence woven into Word, Excel and PowerPoint
Agent Mode represents the next leap in Copilot: giving users superpowers through agentic loops and reasoning models that create, analyze, and transform content on the fly. I was deeply involved in the early design exploration, prototyping interactions that shaped how these autonomous capabilities would feel in Word, Excel, and PowerPoint. By rapidly iterating on concepts and sharing findings with the broader team, I helped establish the direction and informed key research decisions, ensuring Agent Mode feels intuitive while unlocking genuinely magical workflows.
Project
User Evaluation
An AI-native platform for customer understanding at scale, featuring transcription, insights, and multimodal AI chat.
DESIGNING FOR INSIGHTS
Crafting User Evaluation's AI Platform
At User Evaluation, I steered the design process from early concepts to a full-fledged AI-native platform, tackling customer understanding at scale. Beyond UX, I also played a product management role, shaping the vision and requirements for features like robust transcription, AI insights, and multimodal AI chat. By aligning every feature—from AI-generated reports to advanced visualization—with business objectives, I helped position User Evaluation as a leading solution for AI-driven customer research.
Project
Makerarm
A complete digital fabrication system for makers: 3D print, laser, carve, plot, assemble, pick/place + more on your desktop!
DIGITAL FABRICATION DESIGN
How might we create an intuitive user experience for complex digital fabrication?
Leading the design of Makerarm, a groundbreaking desktop fabrication system, I focused on making complex manufacturing processes accessible and user-friendly. The project successfully raised over $435,000 on Kickstarter and garnered significant attention for its innovative approach to democratizing manufacturing technology.
Project Details
- My Role
- Design Lead
- UX Designer
- Software Architect
- Tools & Methods
- User Research
- Prototyping
- Interface Design
- Full-stack Development
- Team
- Tashfeen Ahmed
- Azam Shahani
- Yasir Sheikh
- Duration
- 1.5 years
Challenge
The primary challenge was creating a unified user experience that could seamlessly handle multiple manufacturing processes. We needed to design an interface that would make complex fabrication accessible to makers while maintaining professional-grade capabilities.
Solution
I led the development of Makerarm.io, a browser-based control interface that revolutionised how users interact with manufacturing equipment. The software features real-time 3D visualization, intuitive controls, and smart tool detection, making complex fabrication processes accessible to both novices and experts.
- Adaptive Interface: smart tool detection automatically adjusts the interface based on the attached fabrication head
- Wireless Control: a cloud-based architecture enables remote control from any device, anywhere
- Teaching Mode: an interactive system lets users train the arm through demonstration
Impact
The project exceeded its funding goal by 25%, raising over $435,000 on Kickstarter. The interface design received particular praise, with TechCrunch highlighting its ability to "make complex fabrication accessible to anyone." The software architecture we developed became a blueprint for future desktop manufacturing systems.
Key Learnings
Leading this project taught me the importance of balancing innovation with usability. While we pushed the boundaries of what's possible in desktop fabrication, we never lost sight of our core goal: making manufacturing technology accessible to everyone. The project's success demonstrated that complex technical capabilities can be delivered through simple, intuitive interfaces when user experience is prioritized throughout the development process.
Project
Fidget Digits
A self-tracking device to track fidgeting behaviour during online lectures.
SELF-TRACKING & BEHAVIOUR VISUALIZATION
How might we enable students to self-track and visualise fidgeting behaviour?
This project explores the self-tracking, annotation, visualisation, and regulation of fidgeting behaviour during online lectures. Fidget Digits, a self-tracking device, adds 'physical margins' around the structured digital workspace students work in: secondary interactions such as fidgeting can extrinsically enhance a user's state as they work toward completing their primary tasks.
Project Details
- My Role
- Project Management
- Ideation
- User Research
- Product Design
- Interaction Design
- Tools & Methods
- Adobe XD
- Rhino 3D
- KeyShot
- 3D Printing
- Arduino Nano
- Team
- Dr. Maria Wolters
- Tashfeen Ahmed
- Jiangnan Xu
- Sharon Li
- Peize Li
- Duration
- 3 months
Problem
Fidgeting means making small movements, especially of the hands and feet, out of nervousness or impatience, usually unconsciously. Research in psychology shows that fidgeting reflects a person's attentional state, and that it increases over time as attention wanes. Online lectures make it especially hard to concentrate, and there is a genuine dilemma around fidgeting's benefits and drawbacks: schools have banned fidget spinners in the past, yet research shows that in some contexts fidgeting can actually be a way of regulating attention, and hence beneficial.
User Research
Through semi-structured interviews with students (N=10) from The University of Edinburgh, we found that fidgeting and doodling are common behaviours during online classes, occurring more often in longer lectures. Some students actively use doodling to regulate attention, and students preferred a handheld fidgeting tracker to a wearable device.
Solution
Based on the user research, we sketched concepts and discussed them with the target audience, gathering user feedback at every step of the design process. The finalised designs were modelled in Rhinoceros 3D and 3D printed for evaluation. Fidget Digits is equipped with a speed-measuring sensor that captures the number of spins: a single spin equals one data reading. Each reading is recorded with a timestamp so that it can be visualised on a time series. Users can annotate the data to make better sense of it, or sync it with their calendar to auto-annotate timings.
Data Visualisation & Annotation
Data from the Fidget Digits device is stored locally on an SD card and can then be transferred to another device for visualization and annotation. The goal of this project is not only to let users visualize their data but also to annotate it. Annotation happens automatically when the calendar is synced with the applet, or users can manually select fidgeting events and annotate them. The data visualisation applet keeps a diary of readings and annotations.
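To make this pipeline concrete, here is a minimal sketch of how the applet's time-series view could be built in Python. The CSV layout, column names, and annotation entries are illustrative assumptions for this example, not the device's actual log format.

```python
# Minimal sketch of a Fidget Digits time-series view (illustrative only).
# Assumes a hypothetical "fidget_log.csv" exported from the SD card with
# columns "timestamp" and "spins"; the real log format may differ.
import pandas as pd
import matplotlib.pyplot as plt

# Each row is one sensor reading: one spin recorded at one timestamp.
log = pd.read_csv("fidget_log.csv", parse_dates=["timestamp"])

# Resample into one-minute bins so the series shows fidgeting intensity
# over the course of a lecture.
per_minute = log.set_index("timestamp")["spins"].resample("1min").sum()

fig, ax = plt.subplots(figsize=(10, 4))
per_minute.plot(ax=ax, color="steelblue")
ax.set_xlabel("Time")
ax.set_ylabel("Spins per minute")
ax.set_title("Fidgeting during an online lecture")

# Manual or calendar-synced annotations marking events of interest
# (timestamps below are placeholders).
annotations = [
    ("2021-03-02 10:45", "Q&A begins"),
    ("2021-03-02 11:10", "Attention dip?"),
]
for ts, label in annotations:
    ax.axvline(pd.Timestamp(ts), color="tomato", linestyle="--")
    ax.annotate(label, (pd.Timestamp(ts), per_minute.max()),
                rotation=90, fontsize=8, va="top")

plt.tight_layout()
plt.show()
```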
Results
Our aim was to design a device that does not distract users but helps them become more self-aware of their fidgeting behaviour. Our initial prototype focused on functionality over ergonomics; through user evaluation and feedback, we revised the design into a more user-friendly, ergonomic device. The final prototype successfully tracks fidgeting patterns and provides meaningful insights through visualization, without causing distraction.
Project
pyReality
A Python library for creating immersive mixed reality data visualizations directly within Jupyter Notebooks.
MIXED REALITY DATA VISUALIZATION
How might we make mixed reality visualizations more accessible and interactive for data analysis?
pyReality is a Python library developed as part of my master's dissertation project for the Design Informatics programme at the University of Edinburgh. It enables users to create mixed reality data visualizations directly within Jupyter Notebooks, supporting interactive 3D visualizations through WebXR. Designed for use with head-mounted displays (HMDs), pyReality allows users to explore and analyze data in an immersive and intuitive way.
Project Details
- My Role
- Researcher
- Developer
- Designer
- Tools & Methods
- Magic Leap
- Python
- Jupyter Notebooks
- WebXR
- Babylon.js
- VRIA
- Team
- Dr. Benjamin Bach
- Tashfeen Ahmed
- Duration
- 6 months
Problem
Data visualizations often lack interactivity and immersion, limiting their ability to convey insights effectively. Traditional workflows split modeling, rendering, and viewing across separate platforms, making the process time-consuming and fragmented. Researchers and data analysts need a way to quickly prototype and explore data in three dimensions, but existing solutions don't integrate well with common data analysis workflows.
Solution
pyReality streamlines the creation of mixed reality visualizations by integrating data modeling and visualization into a single tool. Leveraging Jupyter Notebooks and WebXR, it enables users to develop, model, and render visualizations in real time, directly within the notebook, eliminating the need to switch between tools or platforms.
Key Features
pyReality offers immersive 3D scatterplots that allow users to visualise datasets with spatial depth, enhanced by augmented reality (AR) components rendered using Babylon.js. It also supports 3D bar charts, leveraging VRIA for highly customizable visualizations that can be configured through Python code or graphical user interface (GUI) controls. This flexibility allows both programmatic control for automation and interactive manipulation for exploratory analysis.
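To give a feel for the workflow, here is a sketch of what a notebook session might look like. The `pyreality` calls are shown as comments because the function names are hypothetical stand-ins for the interaction model described above, not the library's published API.

```python
# Illustrative Jupyter session for a pyReality-style workflow.
# Only the data preparation below actually runs; the visualization calls
# are hypothetical placeholders sketching the intended interaction model.
import numpy as np
import pandas as pd

# A toy dataset standing in for whatever the analyst is already exploring.
df = pd.DataFrame({
    "x": np.random.rand(200),
    "y": np.random.rand(200),
    "z": np.random.rand(200),
    "cluster": np.random.randint(0, 4, 200),
})

# Hypothetical calls: build an immersive 3D scatterplot from the DataFrame,
# render it inline via Babylon.js, then serve a WebXR session that a
# head-mounted display such as the Magic Leap can join.
# scene = pyreality.scatter3d(df, x="x", y="y", z="z", color="cluster")
# scene.show()
# scene.serve()
```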
Results
pyReality demonstrated the effectiveness of mixed reality visualizations in enhancing data comprehension. Early testers highlighted its accessibility and speed for prototyping complex datasets in AR and VR formats. The library successfully bridges the gap between traditional data analysis workflows and immersive visualization, making mixed reality data exploration accessible to researchers and analysts who are already familiar with Python and Jupyter Notebooks.
Project
MIDI Controller
A custom Arduino-based MIDI controller designed to enable three-deck mixing for deep house DJ performances.
HARDWARE DESIGN & DEVELOPMENT
How might we design a custom MIDI controller to enable seamless three-deck mixing?
I designed and built a custom Arduino-based MIDI controller to expand my DJ setup and enable a third track in my mixes. This personal project combined my passion for deep house music and my desire to learn Arduino programming. The controller seamlessly integrates with Rekordbox DJ software, adding layers to my performances and unlocking creative possibilities.
Project Details
- My Role
- Designer
- Developer
- Researcher
- Tools & Methods
- Arduino IDE
- Arduino C++ Programming
- MIDI Protocol
- Hairless MIDI
- Digital/Analog I/O
- PWM Control
- Rekordbox Integration
- Team
- Tashfeen Ahmed
- Duration
- 3 months
Background
As a deep house DJ, I often felt limited by my two-channel Pioneer DJ controller, which restricted me to mixing just two tracks at a time. Adding a third track in real-time required switching between layers, which disrupted the flow of my sets. Inspired by my prior experiments with analog synthesisers and coded music sequences, I set out to build a custom MIDI controller that would integrate seamlessly with Rekordbox and offer dedicated controls for effects. Beyond supporting my performances, this project was an opportunity to deepen my knowledge of Arduino programming and MIDI protocols.
Process
1. Prototyping the Circuit
I began by designing the internal circuit using an Arduino Uno, six potentiometers, six LED indicators, and six push buttons. Each component was carefully selected to handle specific controls like volume, EQ, and effects toggles.
2. Programming MIDI Signals
Using the Arduino IDE, I wrote firmware to translate button presses and knob rotations into MIDI signals that the Rekordbox software could understand. I integrated Hairless MIDI as a bridge to connect the hardware via USB.
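For readers curious about what those MIDI signals look like on the wire, below is a small Python sketch of the three-byte messages the firmware emits over USB serial for Hairless MIDI to forward. The actual firmware is written in Arduino C++; the serial port name and baud rate here are assumptions for illustration.

```python
# Sketch of the 3-byte MIDI messages sent over serial (illustrative; the
# real firmware is Arduino C++). Requires pyserial; the port name and
# 115200 baud are assumptions for this example.
import serial

CC_STATUS = 0xB0  # Control Change, MIDI channel 1
NOTE_ON = 0x90    # Note On, MIDI channel 1

def knob_to_cc(analog_reading: int, controller: int) -> bytes:
    """Map a 10-bit Arduino analog read (0-1023) to a 7-bit CC value (0-127)."""
    return bytes([CC_STATUS, controller, analog_reading >> 3])

def button_to_note(note: int, pressed: bool) -> bytes:
    """Buttons send Note On with velocity 127 when pressed, 0 on release."""
    return bytes([NOTE_ON, note, 127 if pressed else 0])

if __name__ == "__main__":
    with serial.Serial("/dev/ttyUSB0", 115200) as port:
        port.write(knob_to_cc(512, controller=7))     # volume knob at mid-turn
        port.write(button_to_note(60, pressed=True))  # effect toggle pressed
```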
Physical Design & Testing
3. Physical Design
For durability and aesthetics, I built the casing from wood, ensuring it was spacious enough to house the circuitry while remaining compact for portability. The layout was optimized for quick, intuitive use during performances.
4. Testing & Refinement
I tested the device extensively during rehearsals, fine-tuning mappings and response times to ensure seamless integration with Rekordbox. Adjustments included refining LED feedback and improving button responsiveness for real-time use.
Result
This project achieved its goal of adding a third deck to my DJ setup, enabling richer and more dynamic mixes. The MIDI controller's intuitive layout allowed me to trigger effects seamlessly during performances. Beyond improving my DJ sets, this project deepened my expertise in Arduino programming, MIDI protocols, and hardware integration. While initially designed for Rekordbox, I later experimented with mapping controls in Ableton, discovering even greater flexibility for studio use.
Project
Expressive Light Lamp
A non-invasive, subtle design to guide physical distancing in shared student spaces.
EXPRESSIVE LIGHT LAMP DESIGN
How might we help students maintain physical distancing in shared spaces?
During the COVID-19 pandemic, social distancing emerged as a crucial measure to prevent the spread of the virus. Initially, people were required to quarantine, but as restrictions eased and spaces reopened, maintaining physical distance became the new norm. This project was developed as part of my studies in the Design Informatics programme at the University of Edinburgh to address this challenge. We used expressive lights to convey the message of physical distancing in shared student spaces. The design was non-authoritative, non-invasive, and subtle, making students aware of their physical distance from others.
Project Details
- My Role
- Researcher
- Designer
- Maker
- Tools & Methods
- Surveys
- Secondary Research
- Product Design
- Adobe XD
- Rhino 3D
- Arduino Nano
- Video Editing
- Team
- Dr. Bettina Nissen
- Tashfeen Ahmed
- Tamara Lottering
- Chloe Lei
- Duration
- 3 months
Problem & Solution
Problem
The COVID-19 pandemic changed daily life, emphasizing the importance of physical distancing to reduce the spread of viruses. Shared spaces in universities posed challenges as they often promote social closeness.
Solution
We designed the expressive light lamp as a subtle and effective physical distancing aid for shared study spaces. Combining interior design aesthetics with sensing technologies, the lamp monitors proximity and conveys feedback through light signals, maintaining comfort without feeling intrusive.
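As a rough illustration of the feedback logic (the device itself runs on an Arduino Nano), here is a Python sketch mapping a proximity reading to a light colour. The distance thresholds and RGB values are assumptions chosen for the example, not the lamp's actual calibration.

```python
# Illustrative feedback logic: distance reading -> RGB light signal.
# Thresholds and colours are assumptions, not the lamp's real calibration.
def distance_to_color(distance_m: float) -> tuple[int, int, int]:
    """Map a measured proximity (metres) to an RGB signal."""
    if distance_m >= 2.0:
        return (0, 180, 80)    # calm green: safe distance
    if distance_m >= 1.5:
        return (255, 170, 0)   # warm amber: getting close
    return (220, 40, 40)       # red: too close

for reading in (2.4, 1.7, 0.9):
    print(f"{reading} m -> {distance_to_color(reading)}")
```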
Results
The expressive light lamp features subtle color changes to indicate safe or unsafe physical distances, with proximity sensors driven by an Arduino Nano providing real-time distance detection. Its non-invasive design blends seamlessly into study spaces, maintaining functionality without resembling a surveillance tool. Usability testing showed the lamp effectively influenced distancing behavior, though we observed some limitations, such as habituation; future iterations could focus on improving adaptability to avoid over-familiarity.
Project
Motive / KeepTruckin
A modern developer platform enabling seamless API integrations for fleet management solutions.
DEVELOPER EXPERIENCE DESIGN
How might we create a seamless developer experience for fleet management integrations?
I designed KeepTruckin's modern developer platform, empowering partners, developers, and customers to access the company's powerful APIs and tailored fleet management solutions. Through comprehensive user research and industry analysis, I crafted an experience that enables developers to independently build, test, and publish applications to the marketplace. KeepTruckin has since rebranded as Motive.
Project Details
- My Role
- Lead Product Designer
- UX Researcher
- Design System
- Tools & Methods
- User Research
- Prototyping
- Developer Interviews
- API Documentation Design
- Team
- Tashfeen Ahmed
- Awais Imran
- Haseeb Jamil
- Duration
- 1 year
Challenge
KeepTruckin needed a platform that would enable developers to create and configure custom applications, test their integrations with our APIs, and publish apps directly to the marketplace—all without requiring constant support from the API team. My challenge was designing a seamless, self-service developer portal that could scale alongside the existing App Marketplace, enabling secure and efficient third-party app integration into fleet management workflows.
Understanding the System
Before moving forward, I investigated how the new Developer Platform would impact existing workflows in the App Marketplace. To manage complexity and ensure incremental improvements, I structured the project in multiple phases: First, I mapped the current environment, exploring how developers interact with the Marketplace and APIs. Then, using high-level user flows, I visualized the new end-to-end journey from app creation and testing to final publication. To truly understand our user base, I conducted interviews with developers, identified their pain points, and validated assumptions through regular feedback sessions.
Approach
I collaborated closely with product managers and engineers, involving them early in the UX process to maintain alignment on technical feasibility and user needs. My exploration of Developer Experience (DX) included deep dives into books, case studies, and podcasts about best practices. These insights guided critical decisions on navigation, documentation style, and testing interfaces. We began with Sketch and Abstract but transitioned to Figma to enable broader team collaboration as it became our new standard.
Solution
KeepTruckin's modern developer platform empowers users to integrate their applications with our APIs through intuitive design and comprehensive documentation. Key features include:
- Comprehensive API Documentation: accelerates developer understanding and reduces time to market
- Self-Service Workflow: enables developers to register apps, generate API keys, and deploy integrations without extensive support
- Enhanced Security & Privacy: provides administrators granular control over data-sharing permissions
- Scalable App Marketplace: expands the ecosystem with innovative third-party integrations
Impact & Learnings
We designed a clean, user-centric interface where developers can navigate documentation, manage multiple applications, and monitor real-time usage. By reducing friction, this platform optimizes productivity and nurtures a thriving community of creative solutions. This project reinforced the importance of deeply understanding the developer mindset—emphasizing documentation clarity, flexible sandbox environments, and streamlined workflows. Moving forward, I continue to challenge myself and others: How can we reduce onboarding to mere minutes while ensuring continuous developer delight?
Project
COVID in Pixels
This project analyzes global news to show how COVID-19 shifted humanitarian reporting. It tracks changes in focus on vaccines, medical response, aid, and lockdowns.
DATA ANALYSIS & VISUALIZATION
How might we measure the impact of COVID-19 on humanitarian news worldwide?
We used Natural Language Processing (NLP) to analyze humanitarian news articles from Euro-Atlantic Countries, Gulf Donors, and New Global Media Players from December 2019 to August 2020. This analysis revealed insights into the directionality of humanitarian aid, key topics, and national approaches to pandemic management.
Project Details
- My Role
- Researcher
- Data Analyst
- Designer
- Developer
- Tools & Methods
- Python for NLP
- Topic Modelling
- TF-IDF Analysis
- Highcharts.js
- AmCharts
- Team
- Dr. Maria Wolters
- Tashfeen Ahmed
- Tamara Lottering
- Xiaohang Xu
- Minjia Zhao
- Jin Mu
- Duration
- 3 months
Methodology
Our analysis began with data cleaning and preprocessing, including normalization, stemming, and stopword removal. We utilised trigrams for collocation analysis, which proved more effective than bigrams at revealing contextual term usage. At the time of this project, GPT models were not yet widely used for text analysis, so we relied on traditional NLP techniques, which provided robust insights into humanitarian discourse patterns.
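As a condensed illustration of that preprocessing and collocation step, here is a Python sketch using NLTK; the two sample sentences stand in for the scraped article collections.

```python
# Condensed sketch of the preprocessing + trigram collocation step (NLTK).
# The two sample strings below stand in for the real news corpus.
import nltk
from nltk.corpus import stopwords
from nltk.stem import PorterStemmer
from nltk.collocations import TrigramCollocationFinder, TrigramAssocMeasures

for resource in ("punkt", "punkt_tab", "stopwords"):  # punkt_tab on newer NLTK
    nltk.download(resource, quiet=True)

articles = [
    "Humanitarian aid reached the region as vaccine trials began.",
    "Vaccine trials began while lockdown measures shaped the medical response.",
]
stop = set(stopwords.words("english"))
stemmer = PorterStemmer()

# Normalize, drop stopwords and punctuation, then stem.
tokens = [
    stemmer.stem(tok)
    for article in articles
    for tok in nltk.word_tokenize(article.lower())
    if tok.isalpha() and tok not in stop
]

# Rank trigrams by pointwise mutual information to surface contextual usage.
finder = TrigramCollocationFinder.from_words(tokens)
print(finder.nbest(TrigramAssocMeasures.pmi, 10))
```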
Analysis & Visualisation
We employed LSA and LDA topic modeling alongside TF-IDF analysis to understand topic evolution over time. The exploratory data analysis revealed interesting patterns in news discourse subjectivity, particularly between US and UK sources from 2010 to 2020. One of my favourite ways of finding patterns in the data was through concordance, which is a list of all occurrences of a phrase in a text with its immediate context. This technique allowed us to understand how specific terms were used across different media sources and time periods.
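The sketch below shows the TF-IDF and concordance techniques on a toy two-document corpus (standing in for articles grouped by source and period); the LSA/LDA modeling is omitted for brevity.

```python
# Sketch of the TF-IDF + concordance steps on a toy corpus.
import nltk
from sklearn.feature_extraction.text import TfidfVectorizer

for resource in ("punkt", "punkt_tab"):  # punkt_tab on newer NLTK
    nltk.download(resource, quiet=True)

docs = [
    "vaccine rollout dominated coverage of the medical response",
    "lockdown measures and humanitarian aid dominated early reporting",
]

# TF-IDF surfaces terms that distinguish one slice of coverage from another.
vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(docs)
terms = vectorizer.get_feature_names_out()
for i, doc in enumerate(docs):
    weights = matrix[i].toarray()[0]
    top = sorted(zip(terms, weights), key=lambda t: t[1], reverse=True)[:3]
    print(f"doc {i} top terms:", top)

# Concordance: every occurrence of a term with its immediate context.
text = nltk.Text(nltk.word_tokenize(" ".join(docs)))
text.concordance("dominated", width=60)
```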
Results
Using Highcharts.js and AmCharts, we created interactive visualizations that effectively communicate key insights from the data. The project website features a comprehensive timeline of news articles alongside dynamic line and bar graphs, providing an intuitive interface for exploring media discourse patterns during the pandemic. The analysis revealed significant differences in how various countries and media outlets framed humanitarian discourse, particularly in the directionality of aid and the evolution of key topics throughout the pandemic's early months.
Contact
hello@tashfeen.me
Feel free to write a sticky note and I'll get it.