Automotive UI Prototyping with Real Car Data
A hands-on tutorial on using ProtoPie, RemotiveCloud and Figma to bring car UIs to life with actual vehicle signals.

The need for prototypes that feel as if you’re actually moving
Nothing beats UX testing on ‘the real thing.’ But when you’ve spent any time working in automotive HMI design, you know that testing your designs in an actual prototype car is a rare luxury.
If you’re lucky, you may have access to a buck setup for judging your UI visuals and ergonomics in a semi-realistic layout.
The lack of realism gets particularly tricky when user testing. As a researcher, you’ll have to do a lot of explaining to provide your test users with enough context to understand what they’re looking at (“Now imagine you’re driving on the highway at 120km/h.”).
RemotiveLabs has developed something that could alleviate this lack of realism. Their tools provide streams of real vehicle data (telemetry, location, etc.), which help automotive software developers test applications with realistic data in their simulators.
This allows for multiple ways to power simulations:
- Choose from a pre-made library of sample recordings,
- Upload your own signal data files (e.g., BLF, ASC, or MF4),
- Or connect an actual vehicle for a live data stream to test your HMI designs under real-world conditions.

Wouldn’t it be nice if we could test our interactive car UI prototypes with real car data?
The case for designers
These tools can also help designers:
- Test prototypes of instrument clusters with realistic data (and data visualizations).
- Use vehicle data to bring infotainment mockups to life.
- Duct tape a touch screen into any car and feed live data into a prototype UI.
Curious how easy it would be for an automotive UX designer like myself to incorporate this into a real-world design prototype, I decided to try integrating RemotiveLabs’ tools into a typical UI design workflow.
Our UI Experiment: Figma, ProtoPie, and RemotiveCloud

Today’s de facto UI design tool is Figma. Design teams and their stakeholders use it to collaboratively iterate on design proposals.
Ideally, these design iterations also include testing:
- Designers need to test their design proposals to evaluate the dynamic aspects of their concepts ‘in motion.’
- User researchers need to do user tests with actual users to test intuitiveness, learnability, and ergonomic factors.
Unfortunately, Figma’s prototyping features are a bit rudimentary and usually not sufficient for automotive purposes. So, for interactive prototyping, I prefer another tool: ProtoPie.

RemotiveCloud streaming vehicle data into an HMI prototype on a touch display
In the automotive world, ProtoPie is a popular prototyping platform because it allows the use of multiple simultaneous (touch) displays, sensors, and complex conditional logic.
We will need a ProtoPie license that includes ProtoPie Connect (an add-on that allows prototypes to communicate with external devices and data sources).
Download ProtoPie Studio and sign up for ProtoPie Connect
Here’s the plan:

Import UI from Figma
To begin with, we’ll take the automotive dashboard UI from Figma and bring it into ProtoPie to add interactivity.
Once we have the prototype set up in ProtoPie, we’ll connect it to RemotiveLabs’ toolset and drive various objects in our UI with streams of vehicle data.
If we can figure out how to do this, the world’s our oyster: we can use real vehicle data as parameters for controlling any aspect of our user interface.
Setting the scene

We will be making an infotainment screen for a fictitious urban mini-mobility vehicle. It has a basic central touchscreen, which functions both as a simple infotainment screen and as the instrument cluster.
The vehicle is designed for short trips around town (typically < 30 minutes), so functionality-wise, we will keep the system simple:
- An instrument cluster with the basic information needed for driving the vehicle.
- Navigation functionality.
- A music section, which offers digital radio and a Bluetooth music connection to your phone.
Let’s go: setting up the demo project in Figma
In this experiment, we’ll be feeding real vehicle data into various aspects of the instrument cluster section of the screen.
To make this work, we don’t need to do anything special in Figma. We can set up the design as we always do. Needless to say, it will make everyone’s lives easier if all layers have meaningful names - but that’s good practice, anyway.
From Figma into Protopie
Getting your Figma frames into ProtoPie is as easy as pie. The kind folks at ProtoPie provide a Figma plugin that lets you import frames and turn them into fully editable ProtoPie “scenes”.

Your Figma design will typically carry over 95% intact. The main things I have seen break in the conversion are:
- Gradients
- Background blurs (for that popular frosted glass effect)
- Image aspect ratios, which can typically be corrected easily in ProtoPie
To bring gradients (e.g., for masks) over from Figma to ProtoPie, I recommend copying the gradient layers in Figma as SVG and pasting them into ProtoPie.
Download ProtoPie's Figma Import Plugin here: https://www.protopie.io/figma

Preparing the ProtoPie for real-time data input
The first thing we need to do is to make any layer or object that we want to manipulate ready to receive changes. We do this by selecting the objects and clicking the “Make editable” button for each of them.
Again, double-check that each layer has a meaningful name, because you will need to find them again in a dropdown list.
Once all the layers you want to manipulate using real-time data are editable, it’s time to set up the RemotiveLabs data stream we want to use.
Setting up the RemotiveLabs tools

For my experiment, I’ve been using the free trial license of RemotiveLabs, which gives access to a number of trial recordings that provide dozens of different variables and signals to feed into our demo.
In the web environment, we can cherry-pick the signals we want to use in our prototype. And there’s no need to do it all at once; we can always come back later to add or remove signals from our selection.
I prefer to use VSS signals over OEM-specific signals whenever they’re an option. VSS stands for “Vehicle Signal Specification”, a standardization protocol for automotive data from COVESA. The nice thing about VSS is that all signals have meaningful, descriptive names.
I like things that make life easier.
Sometimes, it takes a bit of searching for the right signal name, but usually, they are quite easy to identify thanks to their naming format:
- Vehicle.Speed
- Vehicle.Powertrain.TractionBattery.StateOfCharge.Displayed
- Vehicle.Powertrain.ElectricMotor.Power
- Vehicle.CurrentLocation.Latitude
- etc.
The signal names are important! We will need to refer to these in our prototype.
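To illustrate why exact names matter, here is a minimal Python sketch of routing incoming VSS signals to UI handlers by name. This is not part of either tool's API; the handler functions and their output formats are invented for this example. The key point is that the lookup key must match the stream's signal name exactly, including dots and casing:

```python
# Illustrative only: dispatch incoming (signal name, value) pairs to
# hypothetical UI update handlers, keyed by exact VSS signal name.

def show_speed(value):
    return f"{round(float(value))} km/h"

def show_soc(value):
    return f"{round(float(value))} %"

# The keys must match the VSS names from the stream exactly.
handlers = {
    "Vehicle.Speed": show_speed,
    "Vehicle.Powertrain.TractionBattery.StateOfCharge.Displayed": show_soc,
}

def on_signal(name, value):
    handler = handlers.get(name)
    return handler(value) if handler else None  # unknown signals are ignored

print(on_signal("Vehicle.Speed", "52.4"))  # → 52 km/h
```

A typo in a signal name fails silently (the signal is simply ignored), which is exactly the kind of bug that meaningful, descriptive names help you spot.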
When we’re done, we can look up the exact configuration command needed to run our prototype. We’ll paste this command into a terminal window later, so save it somewhere it can be easily retrieved every time you want to run the prototype.

Feeding the VSS signals into our ProtoPie
There’s one last thing we need to do in ProtoPie before we can run our prototype: make the Pie listen and respond to the VSS signals we selected from the RemotiveCloud stream.
This is a three-step process in ProtoPie:
- Make a global variable, e.g., a numerical variable named “speed”
- Make a trigger that will respond when it receives a VSS signal from ProtoPie Connect. It will listen for the name we found in the data stream, e.g., “Vehicle.Speed”.
- Add the response: We can put the value we received in a text layer we prepared for this purpose (or change the color of an object, make something [in]visible, etc.).
That’s all. Obviously, you can use all of ProtoPie’s extensive logic features to add complex conditional functionality and rich visualizations, far beyond just changing a number on the screen.
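To make the “response” step concrete: a speedometer needle, for instance, is driven by a formula that maps the incoming value onto a rotation angle. Here is a rough Python sketch of that mapping; the gauge range (0–180 km/h swept over −120° to +120°) is an assumption for this example, and in ProtoPie you would express the same formula directly in a Rotate response:

```python
# Illustrative sketch (not ProtoPie code): map a received Vehicle.Speed
# value onto a gauge needle rotation angle.
# Assumed gauge: 0-180 km/h swept from -120 degrees to +120 degrees.

def speed_to_needle_angle(speed_kmh, max_speed=180.0,
                          min_angle=-120.0, max_angle=120.0):
    # Clamp to the gauge range, then interpolate linearly
    clamped = max(0.0, min(speed_kmh, max_speed))
    fraction = clamped / max_speed
    return min_angle + fraction * (max_angle - min_angle)

print(speed_to_needle_angle(90))   # → 0.0 (needle pointing straight up)
print(speed_to_needle_angle(200))  # → 120.0 (clamped to the gauge maximum)
```

The clamping step matters in practice: recorded signals occasionally spike beyond the range you designed for, and without it the needle would rotate past the end of the dial.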

Getting the prototype to run for the first time
We’re now only a few steps away from seeing our prototype run with real data!
1. ProtoPie Connect
Save your Pie (the ProtoPie file) to a local drive so ProtoPie Connect can access it. Then run ProtoPie Connect and locate your local prototype.
Press the little “Display” button next to your imported Pie to open a ProtoPie demo window. This is the window our interactive prototype will run in.
2. Open RemotiveCloud in a browser window
Go to the browser tab where we configured our data stream and press Play to start sending data. Depending on the length of the selected recording, you may need to restart the stream if it runs out before you’re done.
3. Perform some command-line magic in a Terminal window
Next, open a terminal window and paste our command-line string into it (assuming you have already installed RemotiveCLI; if not, this is a great time to do so). RemotiveCLI will serve as our bridge between the data streaming from RemotiveCloud and our ProtoPie demo.

When everything is set up correctly, RemotiveCLI should automatically detect that both ProtoPie Connect and our RemotiveCloud data stream are in place: you should see all the relevant data points scrolling by in the ProtoPie Connect window.
The UI of your prototype should now be responding to all incoming signals!

Some debugging pointers: if it doesn’t work, check whether ProtoPie Connect is receiving signals.
- Yes: The problem is in your ProtoPie prototype.
- No: Check if RemotiveCloud is (still) playing back and your RemotiveCLI is running properly (look for error messages in the terminal window).
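For the curious: conceptually, the bridge takes each (signal name, value) pair from the stream and forwards it as a message ProtoPie Connect understands. The sketch below is illustrative and is not RemotiveCLI’s actual code. ProtoPie Connect’s Socket.IO endpoint, its `ppMessage` event, and the default port 9981 are taken from ProtoPie’s documentation at the time of writing; double-check the current docs before building on them:

```python
# Illustrative sketch of what a stream-to-ProtoPie bridge does conceptually.
import json

def to_pp_message(signal_name, value):
    # ProtoPie Connect messages carry a messageId and a string value;
    # we use the VSS signal name as the messageId.
    return {"messageId": signal_name, "value": str(value)}

# Actually sending it would use a Socket.IO client, e.g. python-socketio:
#   import socketio
#   sio = socketio.Client()
#   sio.connect("http://localhost:9981")  # ProtoPie Connect's default port
#   sio.emit("ppMessage", to_pp_message("Vehicle.Speed", 52.4))

print(json.dumps(to_pp_message("Vehicle.Speed", 52.4)))
```

Knowing this message shape also helps with debugging: the messageId you see scrolling by in ProtoPie Connect is exactly the string your trigger must listen for.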
Next Level Prototyping: maps, location, and video streams
RemotiveLabs’ vehicle data streams can include the vehicle’s location data (typically represented by two variables for latitude and longitude).
If you’re up for a bit of a challenge, you can use these data points for some advanced prototyping:
- Moving maps (which follow the recorded location data of the vehicle).
- Linked video streams (e.g., for simulating parking assists).
- Mixing with input from hardware peripherals, such as steering wheels.
To do this, you’ll have to use ProtoPie Connect’s advanced features: “Stage View” (to stack web views or live video feeds onto your prototype) and “Plugins” (for cool stuff such as integrating Arduino/Teensy, IFTTT, and game controllers).
Those are beyond the scope of this experiment, but it’s good to know it’s possible. And there are plenty of tutorials about these possibilities online.
Read more about ProtoPie Connect here: https://www.protopie.io/connect
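As a taste of the math behind a moving map: the latitude/longitude pair from the stream can be converted into Web Mercator tile coordinates (the scheme used by OpenStreetMap-style “slippy map” tile servers) to determine which map tile to show and where to place the vehicle marker on it. A minimal Python sketch, with central Gothenburg picked as an arbitrary example location:

```python
import math

# Convert a VSS-style lat/lon pair into Web Mercator tile coordinates,
# as used by OpenStreetMap-style "slippy map" tile servers.
def latlon_to_tile(lat_deg, lon_deg, zoom):
    lat_rad = math.radians(lat_deg)
    n = 2 ** zoom  # number of tiles along each axis at this zoom level
    x = (lon_deg + 180.0) / 360.0 * n
    y = (1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n
    return x, y

# Example: central Gothenburg at zoom level 14. The integer parts identify
# the tile to fetch; the fractional parts give the position within it.
x, y = latlon_to_tile(57.7089, 11.9746, 14)
print(int(x), int(y))
```

Feeding `Vehicle.CurrentLocation.Latitude` and `Vehicle.CurrentLocation.Longitude` through a conversion like this on every update is what makes the map scroll along with the recorded drive.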
Now that we know how to access RemotiveLabs data, we can combine all these inputs into highly sophisticated prototypes that will feel much more realistic than the Figma demos we used in the past.

RemotiveCloud supports location data and video footage, which can also be brought into your prototype
About the author
Bram Bos is an experienced UX and product strategist with a deep passion for bridging design and real-world implementation. With a background spanning HMI design, UX research, and rapid prototyping, Bram currently works with major automotive brands and contributes to pushing the boundaries of in-car user experiences.
Bram Bos originally published this blog post on Medium.