Category Archive: Company Execution

Good Developer, Bad Developer

I recently read Ben Horowitz’s piece on the importance of training people in startup companies. At the end of that article, he includes his document “Good Product Manager, Bad Product Manager”. Here’s my spin on it: Good Developer, Bad Developer. Enjoy, and I look forward to your comments!

Good Developer, Bad Developer

Good developer is an artist, a craftsman who enjoys the process of creation. Bad developer considers himself a programmer, responsible for generating lines of code.

Good developer understands the problems of his customers. Bad developer understands only the technical problem at hand. Good developer does not define the why, but constantly strives to understand it. He’s responsible for the how, yet still sees the big picture. Bad developer is focused on building classes, methods, and configuration files, but does not get the big picture.

Good developer understands the complete architecture of the product. Bad developer knows only the components he’s written. Good developer fully understands the technologies that are used within the product. He understands what they are used for, and how they work internally.

Good developer is not afraid of new technologies, but embraces them and quickly gets a grip on them. Bad developer sticks only to what he knows; his immediate reaction to any technical change is negative.

Good developer is constantly learning and improving his skills. Good developer reads technical articles, and finishes several technical books a year. Bad developer does not have time to learn. He’s always too busy with other stuff.

Good developer cares about product quality. He is also very much concerned with process quality. Good developer pushes himself to create bug-free code; bad developer leaves it to QA to find bugs to fix.

Good developer develops features which create value for customers. Bad developer completes tasks. Good developer will never use incomplete requirements as an excuse, and will make sure he fully understands the features he’s working on. Bad developer will wait until the finest details are spelled out for him. To emphasize: good developer is the CEO of the feature. He makes sure he always has the information needed to deliver it, and when information is missing, he goes and gets it.

Good developer is not afraid to go into anyone’s code. Bad developer is afraid of others looking into his. Good developer understands that it shouldn’t take more time to write self-explanatory and well-documented code. Bad developer always needs to allocate extra time to document and simplify.

Good developer will never feel his code is good enough, and will always continue to clean and fix. Good developer always strives to create elegant solutions, but understands that his job is to deliver value to customers. Bad developer thinks only about the elegance of his code and leaves the job of delivering value to others.

Is that all? Did I miss anything, or get some of these wrong? Feel free to chime in in the comments below!

Guy Nirpaz
Co-Founder & CEO, Totango

Totango Analyzed Engagement and Optimized Sales with Over One Million Businesses

[Infographic: over half don’t use their SaaS paid service]

Today, I’m happy to announce that Totango has analyzed customer engagement, and optimized sales and customer success interactions, with more than one million prospects!

We learned a lot during our beta stage, and this is the place to thank all the wonderful customers who took part in it.

Out of the massive amount of data we’ve gathered, here are four main conclusions that can help sales and customer success teams understand where to focus in order to increase revenue from new sales, expansion sales, and renewals:

  • Free trial users who were still active on day 3 of their trial were four times more likely to convert into paying users than the average trial user

What can I do with that information?
SaaS sales teams can use this insight to focus their time on the trials most likely to close, and win more deals.
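To make the segmentation concrete, here is a minimal sketch of how a sales team might compute this lift from its own trial data. The record layout, field names, and toy numbers are all hypothetical, not Totango’s schema; the point is simply comparing conversion for day-3-active trials against the overall rate.

```python
# Hypothetical trial records: which days of the trial the user was active
# on, and whether the trial eventually converted to a paid plan.
trials = [
    {"active_days": {1, 2, 3, 5}, "converted": True},
    {"active_days": {1},          "converted": False},
    {"active_days": {1, 3},       "converted": True},
    {"active_days": set(),        "converted": False},
]

def conversion_rate(users):
    """Fraction of the given trials that converted to paid."""
    return sum(u["converted"] for u in users) / len(users) if users else 0.0

day3_active = [t for t in trials if 3 in t["active_days"]]

overall = conversion_rate(trials)
segment = conversion_rate(day3_active)
print(f"overall: {overall:.0%}, day-3 active: {segment:.0%}, "
      f"lift: {segment / overall:.1f}x")  # the real data showed roughly 4x
```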

  • Active trial users who were contacted by a sales rep were 70% more likely to buy the paid service than those who weren’t

What can I do with that information?
This is strong evidence that timely, contextual engagement with prospects results in more sales

  • A full half of paid SaaS customers log in less than once a month or do not use their paid service at all. Another 19% use their paid service less than once a week. Only 14% of paid customers use their service weekly, and only 17% use it daily

[Chart: how frequently paid customers use their service]

What can I do with that information?
Have the customer success team focus on the non-active paid users, and have the sales team focus on the frequent users to increase upsells
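As a rough illustration of that routing, here is a sketch that buckets paid accounts by login frequency over the last month, so inactive accounts can go to customer success and frequent ones to sales. The thresholds and the `login_times` input are assumptions for the example, not how Totango classifies accounts.

```python
from datetime import datetime, timedelta

def usage_bucket(login_times, now, window_days=30):
    """Classify a paid account by login frequency over the last month."""
    recent = [t for t in login_times if now - t <= timedelta(days=window_days)]
    if not recent:
        return "inactive"              # route to customer success
    logins_per_week = len(recent) / (window_days / 7)
    if logins_per_week >= 5:
        return "daily"                 # upsell candidate for sales
    if logins_per_week >= 1:
        return "weekly"
    return "monthly"

now = datetime(2012, 6, 1)
logins = [now - timedelta(days=d) for d in (1, 2, 3, 8, 9)]
print(usage_bucket(logins, now))       # -> 'weekly'
```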

  • Most cancellations were preceded by a period of non-use

What can I do with that information?
SaaS customer success teams can use this insight to configure alerts for inactive users, and to proactively reach out to these customers and offer help
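A minimal sketch of such an alert, assuming we keep a last-seen timestamp per account; the account names and the 14-day threshold are made up for the example.

```python
from datetime import datetime, timedelta

# Hypothetical activity log: account -> last time any user was active.
last_seen = {
    "acme-corp": datetime(2012, 5, 28),
    "globex":    datetime(2012, 4, 2),
}

def inactivity_alerts(last_seen, now, threshold_days=14):
    """Accounts with no activity for `threshold_days`, flagged for proactive
    outreach before non-use turns into a cancellation."""
    cutoff = now - timedelta(days=threshold_days)
    return [account for account, seen in last_seen.items() if seen < cutoff]

print(inactivity_alerts(last_seen, now=datetime(2012, 6, 1)))  # ['globex']
```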

[Infographic: summary of the four findings above]

Lean Startups are Not About Learning

Lean startups are not about learning — or at least not just about it. Learning is not enough because translating what you’ve learned into value is just as important and often no less difficult.

Measure-Learn-Act is a key tenet of lean startups. We are big believers in lean: we run Lean Startup Israel, and frequently blog and talk about our experience running a lean company.

Measure-Learn-Act is also a core part of our company vision. We believe measuring, learning, and acting on usage data is the right way to build, scale, and operate any SaaS or web business, and we’re building Totango to help companies do just that.

In this post, we’ll talk about how we “eat our own dogfood” and use this principle, and the Totango product, to drive our own product’s evolution.

How do we measure progress?

In his remarkable work on lean startups, Eric Ries uses the following definition for measuring progress at a startup:

Definition of progress for a startup: validated learning about customers

We found that to be a limiting concept.

We discovered that when we defined learning as our core objective, we ended up spending too much energy on our own learning (running A/B tests, minimum viable versions to learn about market interest, etc.) and not enough on leveraging our learning to deliver value to end-users and customers.

In other words, learning is not enough and is not the end goal. The real test of a startup is whether it builds a service of value — one that people want to use and can’t live without.

So our definition of progress is:

Definition of progress for a startup: validated value delivered to customers

‘Value’ means we released a product or service enhancement that helps customers accomplish something better or faster. ‘Validated’ means we have a way to quantify the value delivered, usually through a positive change to one of the key usage metrics we track.

If, and only if, we reach that point do we declare progress. All the rest are considered internal milestones along the way.

How do we measure validated value to customers?

Here is a set of product-level metrics we monitor on an ongoing basis, including signups, activations, and engaged organizations.

Each product or service improvement we undertake must ultimately manifest itself as an improvement in one of these metrics. For example, changes to our signup process need to yield an improvement in signup numbers; a better customer onboarding process should result in a higher and faster rate of activations.
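As a toy illustration of what that test can look like in practice, here is a sketch that compares a key metric before and after a release and only declares progress above a minimum lift. The weekly signup numbers and the 10% threshold are hypothetical, not our actual data.

```python
# Hypothetical weekly signup counts around a release of a new signup flow.
signups_before = [120, 131, 118, 125]   # four weeks pre-release
signups_after  = [142, 150, 147, 155]   # four weeks post-release

def validated(before, after, min_lift=0.10):
    """Declare validated value only if the metric improved by `min_lift`."""
    baseline = sum(before) / len(before)
    current = sum(after) / len(after)
    return (current - baseline) / baseline >= min_lift

print(validated(signups_before, signups_after))  # True -> progress
```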

Our most interesting metric is ‘Engaged Organizations’. Our goal here is to have a quantifiable way to determine if accounts that went through the signup and activation process are actually deriving value from the solution. We measure this by counting the number of days each user performed meaningful interactions with our solution.

‘Meaningful interactions’ are not generic: each service naturally has a different set (in our case it can be usage of our inbox capabilities or interacting with an activity stream).

They also change over time, as new functionality is added or product pivots are made. The point is that they provide a solid way to validate whether users are seeing value in our solution, which in turn helps us determine if the changes we make have a positive impact.
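To make the metric concrete, here is a sketch of counting engagement this way over a raw event log. The event names and log format are invented for the example; only inbox and activity-stream events are treated as meaningful, mirroring the examples above.

```python
from datetime import date

# Hypothetical event log: (account, user, day, event type).
events = [
    ("acme",   "alice", date(2012, 5, 1), "inbox.read"),
    ("acme",   "alice", date(2012, 5, 1), "stream.view"),
    ("acme",   "bob",   date(2012, 5, 3), "stream.view"),
    ("globex", "eve",   date(2012, 5, 2), "settings.open"),
]

# Which interactions count as 'meaningful' is product-specific and changes
# as the product evolves; here: inbox usage and the activity stream.
MEANINGFUL = {"inbox.read", "stream.view"}

def engaged_days(events):
    """Distinct days per account on which a user did something meaningful."""
    days = {}
    for account, _user, day, kind in events:
        if kind in MEANINGFUL:
            days.setdefault(account, set()).add(day)
    return {account: len(d) for account, d in days.items()}

print(engaged_days(events))  # {'acme': 2} -- globex shows no engagement
```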

Learning as a means to delivering value

Validated-value-delivered is the way we measure progress, but that doesn’t mean we don’t spend a lot of time trying to learn what customers want. In fact, we’ve found it is the only way to consistently add value. Otherwise, you spend too much time on bad ideas and get overly invested in product directions that ultimately prove incorrect.

We try to build an MVP (minimum viable product) as the first step for every product idea or feature request we handle. If we get good validation on the need, we know it’s a good place to spend more time.

Summary

Forcing ourselves to scientifically validate our assumptions about customer needs not only helps us reach results faster, it also frees us from the need to argue to death over the merits of certain product directions. Rather than argue about it, we find a quick way to validate things as a precursor to spending more time on them.

But we don’t stop there, and neither should you. Measure value delivered.

It’s the best way to keep yourself true to the core mission of creating value, and make sure your product is on track.
