Grain is an app for macOS and web that lets you record and share the best parts of your Zoom video calls. Take notes during a live call, and instantly turn your notes into sharable clips and highlight reels.
I worked with the co-founders and a small team of 12 to turn Grain’s thesis into a full product. I helped define Grain’s audience, prototyped and tested product ideas, and designed the first version of the product.
Take notes during a live Zoom call.
Create and edit clips on grain.co.
Share clips on Notion, Slack, or anywhere that supports embedding.
Starting out
When I started this project, Grain had set their focus on improving a particular aspect of team communication: capturing the content of a live video call and sharing it with the people who need it.
We decided to target the people most often responsible for this process, a user type we call the “Extractor.” It describes a workflow shared by a wide range of roles, from journalists to recruiters to user researchers.
Some examples of meeting types where extraction is the primary mode of information transfer.
Discovery
We conducted a series of interviews with people who fit our newly defined user type to further understand their workflows, and discover how Grain might be able to help.
Research questions
How do they currently "extract" information from a meeting?
Why do they take notes at all?
What role does video play in this process, if any?
Key findings
Most people summarize meetings by taking notes, which they share with others.
They are incentivized to do it because others rely on their notes or summaries.
They feel compelled to include video clips, but often don’t because it’s too time consuming.
With a better understanding of our new user type, we identified an opportunity where we could help reduce the amount of time they spend summarizing meetings while also improving the quality and effectiveness of their output.
Ideation
David, the other designer, and I came up with various ideas and discussed them with the broader team. These ideas were mostly centered around the note-taking process, since all value flows downstream from there. The process started with rough ideas sketched on whiteboards and in notebooks, which quickly became more concrete user flows and low-fidelity UI as the team reached consensus.
Early concept for note-taking experience.
The team reviews and discusses early concepts.
One of the user flows that came out of the ideation phase.
Grain's design model - One of the main outcomes of the ideation phase.
User Research
We ran a series of evaluative tests aimed at making sure we were building the right product for the right people. First, we showed participants static mocks and walked them through the flows. After a few iterations, we designed and built a prototype and asked people to try it.
Evaluative test with static mocks.
Second round of evaluative testing, this time with a functioning prototype.
Key feature: note-taking
From our evaluative testing, we learned that people preferred the familiar experience of a word processor, and that showing timestamps was an important indicator that Grain and Zoom were synchronized.
One idea I had, which users showed interest in, was a feature we called Reactions: customizable emoji that users could add with a simple shortcut to save a clip at any moment during a live call. People liked this idea because interviews often revolve around a fixed set of themes that recur from meeting to meeting, and a reaction can stand in for each of those shared concepts. Combined with the timestamp, this reduces the amount of work required from the user.
Start typing to create a timestamp, click or use a shortcut to add a reaction
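As a rough sketch of how a reaction might become a clip, consider the model below. All names (`Reaction`, `reactionToClip`, the padding constants) are hypothetical and illustrative, not Grain's actual implementation; the assumption is simply that a reaction records the elapsed call time, which is later padded into a clip range.

```typescript
// Hypothetical data model: a reaction is an emoji plus the elapsed call time
// at which the shortcut was pressed.
interface Reaction {
  emoji: string; // customizable emoji, e.g. one per recurring interview theme
  atMs: number;  // elapsed call time when the shortcut was pressed
}

interface ClipRange {
  startMs: number;
  endMs: number;
  label: string;
}

// Illustrative padding: assume the moment of interest began shortly before
// the user reacted, and keep a little trailing context.
const PAD_BEFORE_MS = 15_000;
const PAD_AFTER_MS = 5_000;

function reactionToClip(r: Reaction, callDurationMs: number): ClipRange {
  return {
    startMs: Math.max(0, r.atMs - PAD_BEFORE_MS),
    endMs: Math.min(callDurationMs, r.atMs + PAD_AFTER_MS),
    label: r.emoji,
  };
}
```

Clamping to the call's bounds handles reactions pressed near the very start or end of a meeting.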
Key feature: clips
When the meeting is over, users are directed to grain.co, where they can start creating clips. According to our early research, this was the most time-consuming part of our users' workflow, so it was essential that clip creation be as quick and easy as possible. Clips can be created from any note, and any reactions used during the call are automatically turned into clips.
Create clips by pressing the green "+" button. Click on a clip to play it.
When it came to editing clips, we tried a few common patterns, like the examples below, but found that users were spending too much time tweaking the clip duration to make sure they were trimming the right content. Ultimately, we took inspiration from an app called Descript and used the transcript provided by Zoom's API as the trimming interface. This saved users time and, interestingly, worked even without audio or a highly accurate transcript.
Two rejected options for clip trimming.
Drag the in and out points to select the portion of the clip you want.
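The transcript-based trimming described above can be sketched roughly as follows: given word-level timestamps (assumed here to come from Zoom's transcript), selecting a run of words yields the clip's in and out points directly. The names (`TranscriptWord`, `trimByWords`) are illustrative, not Grain's actual code.

```typescript
// Assumed shape of a transcript entry with word-level timing.
interface TranscriptWord {
  text: string;
  startMs: number;
  endMs: number;
}

// The user selects a span of words; the clip's in/out points are derived
// from the first and last selected word's timestamps. Note that this needs
// only timestamps, not audio or a perfectly accurate transcription.
function trimByWords(
  words: TranscriptWord[],
  firstIndex: number,
  lastIndex: number
): { inMs: number; outMs: number } {
  if (firstIndex < 0 || lastIndex >= words.length || firstIndex > lastIndex) {
    throw new Error("invalid selection");
  }
  return { inMs: words[firstIndex].startMs, outMs: words[lastIndex].endMs };
}
```

This is why the technique tolerates transcription errors: even if a word is misrecognized, its timestamps still point at the right stretch of video.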
Clips can be shared with a direct link, or embedded on Slack, Notion, iMessage, Twitter, or anywhere that allows embedding.
Clips can be shared anywhere that supports embedding.
Key feature: Highlight Reels
From our initial workflow interviews, we learned that people often share notes or documents summarizing their meetings, and that they want to include video alongside those notes. We added a feature called Highlight Reels: pressing a button on the recording page automatically generates a highlight reel from all clips on the current page.
Clips are strung together into a highlight reel by pressing "Present" on the recording page.
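Conceptually, building a reel amounts to ordering the page's clips and laying them end to end on a new timeline. The sketch below assumes a minimal clip shape and hypothetical names (`buildReel`, `ReelSegment`); it is not Grain's implementation.

```typescript
// Minimal clip shape: a range within the source recording.
interface Clip {
  startMs: number;
  endMs: number;
}

// A segment of the reel: the clip plus where it begins on the reel's
// own timeline.
interface ReelSegment {
  clip: Clip;
  reelOffsetMs: number;
}

function buildReel(clips: Clip[]): ReelSegment[] {
  // Play clips in chronological order of the original recording.
  const ordered = [...clips].sort((a, b) => a.startMs - b.startMs);
  let offset = 0;
  return ordered.map((clip) => {
    const seg = { clip, reelOffsetMs: offset };
    offset += clip.endMs - clip.startMs;
    return seg;
  });
}
```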
Process
Throughout our process, we held bi-weekly reviews as we prepared for the initial release. I worked closely with engineers (half of whom were remote) to ensure things were built to spec. While we designed and tested many features, only a portion of them made it into the initial release; the rest were planned for the following 12 months, assuming all went well.
Prototyping
I made prototypes to test interactions and to give engineers a working example of what they were building, along with detailed specs. Prototyping was an important part of the process since the app had many intricate interactions, like the video players, that were difficult to explain with static images alone. I used both After Effects and Origami, depending on the types of interactions involved.
Prototype for the clip player made in Origami.
Polish
I created and documented a simple design system that sped up our process overall, allowing us to spend more time solving product problems rather than fiddling with UI. The engineering team's React components used this design system as the basis for their own system.
Launch & Reflections
A beta version of Grain was released in late April 2020. So far the feedback has been largely positive, and the team continues to refine the app as more users sign up.
I’m happy I was able to help Grain move from a simple idea to a fully fledged product. Overall, I think the company now has a clearer path ahead and is much better equipped to handle the challenges to come.