Open Design is a developer toolkit that allows you to read and display data from designs using code. Though Open Design was first released in early 2021, the technology behind it has been powering Avocode’s design handoff tool for six years. This is the story of how it all came to be.
Avocode’s humble beginnings
Back in 2014, there was a lot of pain involved with designers handing off UI designs to developers. Many designers were still using Photoshop and developers didn’t have a great way to get what they needed from those designs.
Avocode 1.0 was released to address those problems. At launch, it targeted teams using Photoshop, and we soon heard from those teams that their design → code process had become faster and more accurate. It was a success!
But we were just getting started.
Taking it up a notch
After our launch, we listened closely to feedback from our customers. We also did some UX testing with our designer and developer friends in Prague to learn more about the pain points in the design handoff workflow.
In the middle of 2016, we turned this feedback into an action plan. We focused our entire engineering team on building two interconnected features:
- Open designs without the design tool installed (just drag and drop the file)
- Be able to visually display the full design (including panning and zooming in)
This was quite a lot to take on at once, but we were confident that we could deliver and it would’ve killed us not to at least give it a try.
The first task was building parsers that could extract data from Photoshop and Sketch files without relying on the design tool itself. These parsers would need to deconstruct the file and convert the contents to JSON.
So the team got to work cracking open Photoshop’s notoriously opaque binary format and Sketch’s SQLite database with binary plist blobs (this was before Sketch 43’s new format was released). After a lot of hard work and lots of trial and error, we had the designs converted into a readable JSON document that we called a “Source JSON”.
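To give a flavor of that parsing work, here is a minimal TypeScript sketch that reads just the fixed 26-byte header of a PSD file into plain JSON. The field layout follows Adobe's published file format documentation; real parsers go far beyond this (layers, masks, text engine data, and so on).

```typescript
interface PsdHeader {
  signature: string; // always "8BPS"
  version: number;   // 1 = PSD, 2 = PSB
  channels: number;
  height: number;
  width: number;
  depth: number;     // bits per channel
  colorMode: number; // e.g. 3 = RGB, 4 = CMYK
}

function parsePsdHeader(buf: Buffer): PsdHeader {
  const signature = buf.toString("ascii", 0, 4);
  if (signature !== "8BPS") {
    throw new Error("Not a PSD file");
  }
  // All PSD header fields are big-endian; bytes 6-11 are reserved.
  return {
    signature,
    version: buf.readUInt16BE(4),
    channels: buf.readUInt16BE(12),
    height: buf.readUInt32BE(14),
    width: buf.readUInt32BE(18),
    depth: buf.readUInt16BE(22),
    colorMode: buf.readUInt16BE(24),
  };
}
```

The hard part, of course, is not the header but everything after it: variable-length sections, versioned sub-blocks, and undocumented quirks.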
But we weren’t finished yet. If we had stopped here, the team working on the Avocode app would have to implement all of the app’s functions (measuring distances, extracting text, etc.) twice – once for Photoshop and once for Sketch. On top of that, if a design tool update modified the format, the app team would have to support both the old and new versions of the format (since we couldn’t expect every file to be saved in the newest version).
This sounded like a major headache, so we decided to go a step further. What we needed was a stable API between the parsing team and the app team. So we took the best ideas from the Photoshop and Sketch formats and created a spec for a new JSON-based design format.
This spec turned into Octopus 1.0.
Then the team built converters that mapped values from the Source JSON into the new Octopus format. Some values could be mapped directly and some needed to be converted (for example, coordinate systems were normalized).
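As an illustration of what such a converter does (the field names below are hypothetical, not the actual Octopus 1.0 spec): Photoshop layers carry absolute document-space bounds, while Sketch layers carry frames relative to their parent group, so a converter has to normalize both into a single convention.

```typescript
// Illustrative target shape; not the real Octopus schema.
interface OctopusLayer {
  id: string;
  x: number; // absolute, in document space
  y: number;
  width: number;
  height: number;
}

// Sketch-style source: frame is relative to the parent group's origin.
function fromSketchLayer(
  src: { do_objectID: string; frame: { x: number; y: number; width: number; height: number } },
  parentOrigin: { x: number; y: number },
): OctopusLayer {
  return {
    id: src.do_objectID,
    x: parentOrigin.x + src.frame.x,
    y: parentOrigin.y + src.frame.y,
    width: src.frame.width,
    height: src.frame.height,
  };
}

// Photoshop-style source: absolute top/left/bottom/right bounds.
function fromPhotoshopLayer(
  src: { id: number; bounds: { top: number; left: number; bottom: number; right: number } },
): OctopusLayer {
  return {
    id: String(src.id),
    x: src.bounds.left,
    y: src.bounds.top,
    width: src.bounds.right - src.bounds.left,
    height: src.bounds.bottom - src.bounds.top,
  };
}
```

Once both tools map into the same shape, everything downstream only has to understand one format.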
The obvious advantage of this approach was that the app team could focus on building a great product without worrying about whether a design tool update would break a feature. They were building on something stable and reliable. Meanwhile, the parsing team could focus on testing new design tool updates and making sure that they still translated into Octopus.
With Octopus, we checked off the first task. The next one to tackle was being able to actually display the design to the user. Previously, we had used our Photoshop plugin to export a bitmap of every layer and then placed each one at its specified position. Since we could no longer rely on the original design tool, we needed a new approach for generating these bitmaps.
So we decided to stop thinking like a handoff tool and to start thinking like a design tool. Photoshop and Sketch both had their own rendering engines, and it made sense for us to build one as well. So the team set out to build Render, a rendering engine that could faithfully reproduce the design using nothing but the data in the Octopus format.
We knew beforehand that designers were (and still are!) very intentional with where they put pixels. If this project was going to be successful, we would have to make sure that Render’s output was really really close to the original design tool. To measure precision, we built an internal tool that generated a bitmap of each artboard using both Render and the original design tool. It then compared the differences between the two and highlighted problem areas for our developers to look into. Especially later in the process, text rendering turned out to be a challenging thing to get pixel-perfect.
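The heart of such a precision checker can be sketched as a straightforward pixel comparison. This is an illustrative reconstruction, not the internal tool itself (which also grouped mismatches into highlighted regions):

```typescript
// Compare two same-sized RGBA bitmaps pixel by pixel and report the
// match percentage plus the coordinates that differ beyond a tolerance.
function comparePixels(
  a: Uint8ClampedArray,
  b: Uint8ClampedArray,
  width: number,
  tolerance = 8, // max per-channel delta still treated as "equal"
): { precision: number; mismatches: Array<{ x: number; y: number }> } {
  const mismatches: Array<{ x: number; y: number }> = [];
  const pixelCount = a.length / 4; // 4 bytes (RGBA) per pixel
  for (let i = 0; i < a.length; i += 4) {
    const differs =
      Math.abs(a[i] - b[i]) > tolerance ||
      Math.abs(a[i + 1] - b[i + 1]) > tolerance ||
      Math.abs(a[i + 2] - b[i + 2]) > tolerance ||
      Math.abs(a[i + 3] - b[i + 3]) > tolerance;
    if (differs) {
      const p = i / 4;
      mismatches.push({ x: p % width, y: Math.floor(p / width) });
    }
  }
  return { precision: 1 - mismatches.length / pixelCount, mismatches };
}
```

Run this over every artboard in a corpus of real designs and you get both an aggregate precision number and a to-do list for the rendering team.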
After about a year of really hard work, we attained an average of 99% rendering precision. In addition, the whole thing was lightning fast.
In the summer of 2018, we launched Avocode 3.0 and billed it as the “world’s first truly cross-platform design handoff tool”. For the first time, users could drag and drop a Photoshop or Sketch file directly into Avocode (no software or plugins necessary) and in just a minute or two, they could see and inspect the design. This is what we had been working towards for the last 2 years!
Oh, by the way, we delivered support for Adobe XD and Figma in this update as well. Once we had the foundation of Octopus + Render to work with, adding support for new formats was far simpler than it used to be.
We rode on the coattails of the Avocode 3.0 launch for a few weeks, fixing bugs and watching how all of the new features were being used. One metric we paid special attention to was the amount of time it took between uploading a design and actually being able to see it. It was pretty fast for small Photoshop designs, but it could take a few minutes for a multi-artboard Sketch design.
In that processing workflow, we performed four steps:
1. Avocode app uploads the design to Avocode servers
2. Parsers convert the design to Source JSON and then convert it to Octopus
3. Render generates bitmaps of every layer and saves them to the CDN
4. Avocode app downloads every bitmap and places them on a grid
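The workflow above can be sketched as a pipeline that times each stage, which is how you find out where the minutes actually go (the step implementations here are hypothetical placeholders):

```typescript
// Each stage takes the previous stage's artifact and produces the next.
type Step = (input: string) => Promise<string>;

async function processDesign(
  designPath: string,
  steps: Record<string, Step>,
): Promise<{ artifact: string; timings: Record<string, number> }> {
  const timings: Record<string, number> = {};
  let artifact = designPath;
  for (const [name, step] of Object.entries(steps)) {
    const start = Date.now();
    artifact = await step(artifact); // stages run strictly in sequence
    timings[name] = Date.now() - start;
  }
  return { artifact, timings };
}
```

Because the stages run strictly in sequence, the user waits for the sum of all four, so any stage that can be moved client-side or overlapped is a win.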
As we were thinking about optimizing steps 3 and 4, we had an idea. Instead of rendering every layer server-side and then having the app download all of the bitmaps, could we do the rendering directly in the browser?
We started exploring a technology called Emscripten, a compiler toolchain that turns native C++ programs into code that can run in the browser. With this, the app could simply download the Octopus file and progressively render tiles of the design, resulting in much faster load times. And as the user zoomed in, the design could remain sharp instead of getting pixelated. These improvements would elevate the user experience, so we gave it a shot.
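The tiling idea can be sketched with a bit of viewport math: given the visible region of the design and the zoom level, work out which fixed-size tiles need rendering first. The names and tile size below are illustrative assumptions, not the actual View internals:

```typescript
// The viewport is expressed in design-space units; zoom scales it to pixels.
interface Viewport { x: number; y: number; width: number; height: number; zoom: number }

function visibleTiles(vp: Viewport, tileSize = 256): Array<{ col: number; row: number }> {
  // Convert the viewport from design space to rendered-pixel space,
  // then find the range of tile indices it overlaps. The -1 keeps an
  // edge that lands exactly on a tile boundary from pulling in an
  // extra row or column of tiles.
  const left = Math.floor((vp.x * vp.zoom) / tileSize);
  const top = Math.floor((vp.y * vp.zoom) / tileSize);
  const right = Math.floor(((vp.x + vp.width) * vp.zoom - 1) / tileSize);
  const bottom = Math.floor(((vp.y + vp.height) * vp.zoom - 1) / tileSize);
  const tiles: Array<{ col: number; row: number }> = [];
  for (let row = top; row <= bottom; row++) {
    for (let col = left; col <= right; col++) {
      tiles.push({ col, row });
    }
  }
  return tiles;
}
```

Rendering only these tiles (and re-rendering them at the new scale when the zoom changes) is what keeps the design sharp at any zoom level without ever rasterizing the whole document at once.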
After learning the ropes of Emscripten, learning tons about the limitations of web technologies, and making some optimizations to Render, the first version of View was looking really good. In the spring of 2019, we launched Avocode 3.7 with these changes included. Designs were opening up to 3x faster and customers were loving that their designs remained sharp even when they zoomed in to 1000%. It was a success!
Introducing Open Design
We were really excited to have these core technologies – Octopus, Render, and View – finally integrated into Avocode. We even built free tools using these technologies to drive traffic to our site. The first of these, Photoshop → Sketch Converter, garnered a ton of praise from the community and saw lots of engagement.
From this and feedback from a pre-launch campaign, we realized that the core technologies we built could have use cases beyond just Avocode. If we externalized these and turned them into a product, would anyone use it? We were determined to find out.
In the middle of 2020, we started taking this initiative seriously. We began by creating a REST API to import designs and get Octopus back. Our engineers worked on cleaning up the Octopus format, creating TypeScript types, writing documentation, creating a new public-facing API, and more.
In January 2021, we launched Open Design. Our main goal was to learn more about how potential customers wanted to use this tech in their applications. We hopped on calls with around 40 different companies and kept hearing the same two things:
- Design → Code – though it means something different to everybody, it’s clear that people want to automate the design handoff process
- Design Import – creators of design-based applications want to import Figma, Sketch, XD, Photoshop, and Illustrator designs into their app
We also heard that the raw API endpoints were an obstacle to getting started. That’s why, a few weeks ago, we launched Open Design SDK, which is a Node.js library for interacting with Open Design. We also included Render so that anyone can export high-quality bitmaps and vectors from their designs.
We’re still listening closely to learn more about different use cases as well as how to make it easier to adopt Open Design into your toolchain. We’re excited to finally put the technology that we’ve spent years working on into the hands of people that can really use it. We believe that by making design more open and accessible, we can help make design tools smarter and teams work together better.
Do these challenges sound interesting to you? If so, we’d love for you to join the team.