What Gets Measured Gets Managed: How an App Development Agency Improves

3Advance began as a one-man roadshow, selling mobile app development for delicious half-smokes! Our initial success was built on a tiny team pulling off miracles regularly. But miracles don’t scale, and the past few years brought many changes – we grew in size, and we matured in process. We’re a long way from tracking all the KPIs (key performance indicators) we want to, but here’s why and how we started.

I first came across this proverb when reading The 4-Hour Body by Tim Ferriss. There it was attributed to Peter Drucker, an Austrian-born American management consultant. (Upon further research, there appears to be some debate as to whether Drucker coined it or merely brought it to the masses.)

Anyhow, ever since I read that phrase, it’s been ingrained in my psyche, and I’ve tried to implement it in various ways. In my personal life, this took the form of purchasing a scale (a smart one, of course). The scale I bought makes sadistic use of binary emojis: displaying a smiley face when I lose or maintain my weight and a sad face when I gain weight. I find that if I get on the scale every couple of days, I’m more inclined to watch what I eat. In fact, I typically game the system by only getting on the scale post-workout :-).

Measuring and Managing in App Development

It wasn’t until sometime later that I heard the phrase in a business context (ironic because Peter Drucker received the Presidential Medal of Freedom in 2002 for his contributions to the business world). Since then, I’ve seen it in a few forms:

  • What gets measured gets done.
  • What doesn’t get measured doesn’t get managed.
  • If you can measure it, you can manage it.
  • If you can’t measure it, you can’t improve it.

Since I realized the business applications, I’ve tried to make this a mantra at 3Advance. Like most digital agencies managing creative people, we’ve struggled at times to:

  • measure productivity,
  • instill the discipline of recording time,
  • get a quantitative progress and budget check on a given project, and
  • schedule new work based on when the team can finish current commitments.

The scale, in this case, was a little trickier. We now use several different tools, including agile development methodology, kanban boards, software for tracking time against project budget, software for forecasting future commitments, and a healthy dose of behavioral therapy.

Measuring App Project Success

After seeing some success in project management, we decided to try to measure more than just process and budgets. At a project level, we began measuring project success. To do that, we try to come up with metrics that we can benchmark.

For instance, recently a client came to us with large binders of application forms. He explained that on a monthly basis, users had been reporting back via Excel and Word docs, and communication was by email and telephone. We were charged with bringing this system online. As a side note: even though many people assume that bringing an offline system online will automatically make it more efficient, that’s not always true. In fact, sometimes it’s the opposite. Excel is a mighty foe for the application developer.

But back to our client. We conducted interviews with the two main user groups: the administrators of the system and the end users. These interviews gave us some metrics to benchmark against. The two items we picked were (1) the number of emails sent per week and (2) man-hours spent. By picking measurable benchmarks (instead of the vague “bring the system online” directive), we created a win-win situation all around: our team had a better focus, and our client looked like a hero when he reported the measurable benefits of the new system (emails eliminated and man-hours saved) to his superiors.

Startups don’t necessarily have existing processes or competitors to benchmark against. In those cases, we often focus on the onboarding flow: how long it takes a user to get from registration to a key action (such as payment). Another common metric we quantify, especially for a B2C product, is the system’s ability to handle load. A business may project that it will serve x customers by a given date. We can simulate this with load testing and then measure performance at 100; 1,000; 100,000; and 1,000,000 users.
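In practice we’d reach for a dedicated load-testing tool rather than rolling our own, but the core idea can be sketched in a few lines of Python. Everything below is illustrative: `fake_request` is a stand-in for a real HTTP call, and the request counts are kept tiny so the script finishes quickly.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def load_test(request_fn, num_requests, concurrency):
    """Fire num_requests calls at request_fn with the given concurrency
    and summarize the per-call latencies (in milliseconds)."""
    def timed_call(_):
        start = time.perf_counter()
        request_fn()
        return (time.perf_counter() - start) * 1000.0

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = sorted(pool.map(timed_call, range(num_requests)))

    return {
        "requests": len(latencies),
        "median_ms": statistics.median(latencies),
        "p95_ms": latencies[int(len(latencies) * 0.95) - 1],
        "max_ms": latencies[-1],
    }

# Stand-in for a real request (e.g. an HTTP GET against a staging server)
def fake_request():
    time.sleep(0.01)  # simulate ~10 ms of server work

stats = load_test(fake_request, num_requests=100, concurrency=10)
print(stats["requests"], round(stats["median_ms"]))
```

Ramping the `num_requests` and `concurrency` numbers up (and pointing `request_fn` at a real staging endpoint) is how you’d probe the 100-to-1,000,000-user range described above; the percentile summary, not the average, is usually what the client cares about.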

Measuring After the Mobile App Goes Live

After we launch a web or mobile application, we measure several metrics that help us gauge the health of the system. Part of our default monthly SLA is measuring the performance of a given system. In our digital agency, this includes the following four metrics:

  1. User Analytics
    Using tools like Google Analytics, we’re able to report user numbers, like the number of active users and average session time.
  2. Error/Crash reports
    Most programmers have heard the quote, “If debugging is the process of removing bugs, then programming must be the process of putting them in.” As hard as we work to minimize bugs, users always find a way. To solve this problem, we use tools that log errors across all platforms (mobile, web, server). These tools enable us to be responsive, and in a lot of cases, fix something before the user has even had a chance to report it.
  3. Response time
    This means different things to different clients, but in general, we’re measuring page loads, API server response times, database query responses, and the response time of any third party APIs we may be using.
  4. Support tickets tagged “bug”
    We provide all clients with access to a ticketing system to report any issues once the application goes into beta. Ultimately, we’re looking for the cliff graph here: a large volume of issues during the beta period, followed by a steep drop post-launch.
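To make the response-time idea concrete (this is an illustration, not our actual tooling), a simple decorator can time each call and flag anything over a budget. The `fetch_report` function and the 500 ms threshold are made up for the example; in a real system the timings would flow to a metrics backend rather than a list.

```python
import functools
import time

SLOW_THRESHOLD_MS = 500  # hypothetical per-call budget
timings = []  # stand-in for a real metrics backend

def track_response_time(fn):
    """Record how long each call takes and flag slow ones."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            elapsed_ms = (time.perf_counter() - start) * 1000.0
            timings.append((fn.__name__, elapsed_ms))
            if elapsed_ms > SLOW_THRESHOLD_MS:
                print(f"SLOW: {fn.__name__} took {elapsed_ms:.0f} ms")
    return wrapper

@track_response_time
def fetch_report():
    time.sleep(0.02)  # stand-in for a DB query or third-party API call
    return "ok"

result = fetch_report()
```

The same wrapper pattern works for page loads, API handlers, database queries, and third-party API calls alike, which is why response time “means different things to different clients” but can be measured the same way everywhere.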

These metrics help us ensure that our projects continue to be successful long after we’ve finished the initial development process.

Don’t Measure Everything

We’ve talked a lot about measuring things at our development company so we can improve them. But it’s also important to remember that there is such a thing as too much data. I recently read that about 90% of the data stored in the world was created in the last two years. Just because we can measure something does not necessarily mean we should.

It’s important, then, to note the distinction between actionable and vanity metrics. An actionable metric is something you have control over: by pulling a lever here, you can change what happens over there. A vanity metric is something that looks good on paper but is outside of your control. We’re constantly reviewing what exactly it is that we measure. At our development company, we ask: given this information, is it something we’re going to act on? If the answer is no, maybe it’s something we shouldn’t be measuring.

Conclusion

Whether in your personal or your professional life, measuring the right things can help you improve. At 3Advance, we’ve found success as we’ve added measurement benchmarks to project management, budgets, process, project success, and post-launch tracking. Maybe your app development business – or a completely different one – could benefit from some or all of these benchmarks, too. Good luck!

3Advance is an app development company in Washington DC that helps startups, non-profits, and other businesses turn great ideas into beautiful, simple mobile and desktop apps. If you’d like to learn more about how we can help you create a better way forward for your company, drop us a line. We’d love to hear from you.