Department of the Interior | Open Data, Design, & Development at the Office of Natural Resources Revenue

Publishing federal sales data: how we used design studios and user testing to develop a more user-centered dataset

November 12, 2024

Design studios establish project requirements, define the problem, and explore many solutions. When the Open Data, Design, and Development (ODDD) team took on the challenging project of adding federal sales data to the Natural Resources Revenue Data (NRRD) site, revenuedata.doi.gov, we decided to use design studios to develop the best possible design and content solutions on a short timeline.

In the past, we limited design studio participants to ODDD team members. For this project, we included subject matter experts (SMEs) from throughout the Office of Natural Resources Revenue (ONRR) for 2 reasons:

  • Our team wanted to increase user and stakeholder participation in our design work.
  • SMEs had the in-depth content knowledge we needed, as well as ideas about how customers might use this data.

Participatory design encourages product teams to include users in every step of the design process. Due to project restrictions, we were not able to include external users in the design studios. However, some of our SMEs were also users of this federal sales dataset and NRRD, so we were able to include some actual users in our design studios. After the design studios, we performed user research on the prototypes and webpages within our development site. This allowed us to gather user feedback and make impactful updates. Now that the dataset is public, we plan to include external users in future research studies.

Background

This project started with requirements from outside of our agency. ONRR received a recommendation from the Office of the Inspector General (OIG) to “Develop and implement a means of communicating the oil and gas effective royalty rates to stakeholders and decision makers on an ongoing basis.” Due to the short project timeline, we were unable to perform discovery research. This research would have helped us assess the demand for including effective royalty rate data on our public sites. However, in past user research studies, participants requested federal sales data on NRRD. The federal sales dataset includes the effective royalty rate. Publishing the federal sales dataset would meet the OIG recommendation and user needs.

Design Studios

We adapted our design studio process to meet the needs and requirements of the OIG recommendation. We typically start design studios with a session where we “define appetite.” Appetite is our interest in and capacity to take on the project. Because the OIG recommendation required us to publish federal sales data, defining appetite would not have been an effective use of our time. We already knew we had to complete a specific task under a deadline.

Instead, we started with recruiting SMEs. The ONRR team that owns the dataset chose these SMEs. Our design studio group consisted of 7 SMEs from 3 different ONRR teams and our ODDD team of 6.

Over the course of 5 sessions, we established requirements, created and reviewed low-fidelity prototypes, hole-poked, and refined high-fidelity prototypes. Our two ODDD designers facilitated the sessions and organized offline work. We emphasized building comfort and confidence with the SMEs. We began the studios with the mindset that everyone is a designer, while acknowledging that people who aren’t formally trained in design often find it difficult to see themselves that way. To help them see themselves as designers, we were very flexible about what we considered a prototype. We used facilitation methods that ensured participants had space to share ideas and feedback, and we also asked them to send us feedback outside of the sessions. This allowed the SMEs to process the information and respond at their own pace.

Here’s a breakdown of what took place during and between each design studio session:

Studio 1

We introduced the SMEs to the design studio process and expectations. Each participant shared their background, experience with the dataset, and project concerns. This sharing helped participants build understanding and empathy for each other. We compiled an agreed-upon list of requirements. As a group, we toured NRRD and the various sections where sales data could be added. We also explained the low-fidelity (lo-fi) prototyping assignment and provided examples.

Offline work:

  • Participants created lo-fi prototypes and drafts for charts, graphs, text content, and process diagrams.
  • Participants produced prototypes in any way they were most comfortable. This included pencil sketches, PowerPoint charts, spreadsheets, and annotated screenshots.

Prototype examples from Studio 1’s offline work: A prototype of ONRR's compliance activities with two diagrams. The diagram on the left shows nested circles: upfront system edits in the largest circle, data mining in the second largest, compliance reviews in the third largest, and audits in the smallest. The diagram on the right shows a circular process: payors submit royalty data to eCommerce; automated validation (“upfront edits”) flags potential errors in eCommerce; ONRR’s analysts review flagged errors and notify payors to make corrections; ONRR processes royalty payments; payors make adjustments to previous royalty payments; and the cycle returns to the first step.

A prototype with a map of annual Royalty Value Less Allowances (RVLA) for all commodities in 2017. The map of the United States shades several western states pale green, indicating they have an RVLA value. Notes to the right of the map suggest: the map shows annual data with a selectable year and commodity; it displays royalty value less allowances, but hovering could show allowances and the effective royalty rate (ERR); and users could select individual states and offshore areas, as with other datasets, with cards showing a further breakdown of information, including a state ranking of revenue.

Studio 2

We shared the lo-fi prototypes with the group. Each participant presented their prototype, and the other participants asked questions.

Offline work:

  • ODDD facilitators developed a survey to gather feedback on the prototypes. In design studios, participants provide feedback in real time. Since most of this group was new to design studios, we changed this studio’s format to fit the participants’ needs for more anonymous feedback. Only the designers knew who submitted each survey, so that they could ask follow-up questions as needed.
  • Participants gave feedback on the prototypes via survey.

Studio 3

We reviewed the feedback survey results and discussed the top ideas for each section, using the requirements list from Studio 1. We also discussed possible solutions that were realistic with our resources and timeline. This is also known as “hole-poking.” The group agreed on the ideas and lo-fi prototypes that the designers would develop into high-fidelity (hi-fi) prototypes.

Offline work:

  • Designers created interactive, functional hi-fi prototypes. They divided up the prototyping so each designer was working on different pages.

Prototype example from Studio 3’s offline work: A high-fidelity prototype of a diagram that explains how the data was developed. The diagram appears on the download data page under the heading “How we developed this data” and shows four numbered circles. Circles 1 through 3 are purple, indicating ONRR-2014 form reporting; circle 4 is green, indicating the OIG recommendation. The circles are titled: 1. “Deduct Royalty Relief and Quality Bank Adjustments from Sales Value”; 2. “Apply Contract Royalty Rate”; 3. “Deduct Allowances from Royalty Value”; 4. “Calculate the Effective Royalty Rate”.
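The four steps described in the prototype can be sketched as a small calculation. This is purely an illustration of the sequence the diagram shows; the function name, field names, sample numbers, and the final ratio (royalty value less allowances divided by sales value) are our assumptions for this sketch, not ONRR’s actual reporting methodology.

```python
def effective_royalty_rate(sales_value, royalty_relief, quality_bank_adj,
                           contract_royalty_rate, allowances):
    """Illustrative sketch of the four diagram steps. All formulas here
    are assumptions for illustration, not ONRR's reporting logic."""
    # Step 1: deduct royalty relief and quality bank adjustments from sales value
    adjusted_sales_value = sales_value - royalty_relief - quality_bank_adj
    # Step 2: apply the contract royalty rate
    royalty_value = adjusted_sales_value * contract_royalty_rate
    # Step 3: deduct allowances from royalty value
    royalty_value_less_allowances = royalty_value - allowances
    # Step 4: calculate the effective royalty rate (assumed here to be
    # royalty value less allowances divided by sales value)
    return royalty_value_less_allowances / sales_value

# Hypothetical numbers: relief, adjustments, and allowances pull the
# effective rate below the 12.5% contract rate.
rate = effective_royalty_rate(
    sales_value=1_000_000, royalty_relief=50_000, quality_bank_adj=10_000,
    contract_royalty_rate=0.125, allowances=7_500)
print(f"{rate:.2%}")  # prints 11.00%
```

The point of the sketch is the ordering: because relief, adjustments, and allowances are deducted before and after the contract rate is applied, the effective rate generally differs from the contract rate.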

Studio 4

We split Studio 4 into two sessions to allow plenty of presenting and feedback time for the hi-fi prototypes. Hosting two sessions prevented burnout that could have happened during one long session. The group continued hole-poking with these hi-fi prototypes.

Offline work:

  • Designers continued to refine the prototypes. They gathered more feedback via surveys and email. Key ONRR stakeholders who were not involved in the design studios provided feedback as well. This helped ensure that we were meeting leadership requirements, and it kept stakeholders informed and engaged with the work that went into the project.
  • We updated participants on the revised prototypes and the pages in development.
  • We shared the user research completed on the query and download data pages for this dataset.

User testing

We used design studios to identify problems and agree on solutions. User testing helped us find usability issues and incorporate user feedback into the final product.

We recruited participants from outside the design studio team for user interviews. These participants were unaware of the OIG recommendation and unfamiliar with internal processes related to this data. We limited our participants to U.S. Department of the Interior (DOI) employees because this dataset was new and not yet finalized. We wanted to recruit people who used ONRR data but did not produce ONRR datasets.

Round 1

This round included usability testing for prototypes of the download data and query pages. The ODDD team held 13 user interviews, each lasting about 45 minutes. Participants provided valuable information, including 32 recommendations for improvement. One recommendation for the query tool was to switch the positions of the Sales Value and Sales Volume columns so the dataset would reflect the ONRR-2014 form. One recommendation for the download data page was to clarify the equation, helping users understand the progression of how reporters calculate royalties. ODDD implemented several of these recommendations immediately to improve the product.

Our team held more user testing in Round 2, which examined the information-dense explore data page.

Round 2

This round included hallway testing for the explore data page, which has several interactive components, including a map, graphics, and tables. Three ONRR employees participated in sessions of approximately 10 minutes each. The hallway testing produced 4 recommendations for the explore data page. One recommendation was to add a title to the federal sales map, giving users information about the values the map represents once they choose the federal sales filter. Incorporating this new dataset and these recommendations is time-consuming because the page has many components, so our team is waiting until we bring on an additional designer to do further research and implement the designs.

Publishing the federal sales data

Through the design studios and user testing, our team published federal sales data 7 weeks before the deadline set by ONRR leadership. This initial roll-out included:

A screenshot of the query tool with federal sales data on the NRRD site. The top section shows the filters for the interactive dataset, with Federal Sales, Calendar Year, 2019 through 2023, all commodities, all land types, and all states and offshore regions selected. The data displayed is grouped by commodity, with a row for each year.

A screenshot of the download data page in the federal sales section on the NRRD site. The first step is reporters submit royalty reporting; the second step is ONRR aggregates sales data; and the third step is calculation of the effective royalty rate. All three steps give readers the option to learn more via a drop-down arrow.

A screenshot of the what's new box on the NRRD site with information about the new federal sales dataset. The what's new box includes a section dedicated to the new dataset, with links directly to the download data page and query tool.

Next steps for federal sales data

  • Build explore data page for federal sales data.
  • Perform usability testing on explore data using a beta site, incorporate recommendations, and publish the page.
  • Perform user research with external stakeholders and incorporate recommendations.

Lessons learned

The team worked together to explore lessons learned so we can improve our design process going forward. This project’s lessons learned include:

  • Identify roles on the ODDD team for the design studio and user testing parts of the project, such as project management, interviewing, recruitment, and analysis.
  • For complex projects, create a project plan for project managers to communicate the required level of effort. This includes defining each task, the timeline of each task, and prioritization of the project in comparison to other work.
  • Provide basic notes on decisions made and action items to the team and design studio participants. This could be a running list of notes sent to the entire design group after each meeting, reminding participants of decisions made and action items.
  • Continue to plan projects that move the team towards participatory design. This project was the first time the team included people outside of ODDD in design studios. These stakeholders enhanced our understanding of this dataset and the end user’s needs.

Note: Reference in this blog to any specific commercial product, process, or service is for the information and convenience of the public, and does not constitute endorsement, recommendation, or favoring by the Department of the Interior.

Erin Elzi: User Experience Designer at the Office of Natural Resources Revenue.

Alexandra McNally: User Experience Designer at the Office of Natural Resources Revenue.