Measuring the quality of PeerReach data

The PeerReach algorithm detects a person's expertise in a topic and calculates scores that measure their influence. We constantly monitor and update our algorithms to improve the quality of our data.

One way to measure the quality of our algorithm is to benchmark it against the data held by our competitors. We took a random sample of Twitter users and compared the topics each service associated with them.

Comparison 1: Benchmarking the topics

People can be influential and authoritative in more than one topic, such as Webtech, Music, or Sports. We therefore wanted to find out how accurately we identify topics compared with the competition.

To do this, we subscribed to Twitter's Spritzer sample stream (roughly every 100th tweet), and on one October day we took the first 21 accounts with 1,000 or more followers*.
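The selection step above amounts to a simple filter over the stream. Here is a minimal sketch; the tweet payload shape and helper name are assumptions for illustration, not our production code:

```python
# Sketch of the sampling step: read tweets from a sample stream and keep
# the first `limit` distinct accounts with at least `min_followers` followers.
def sample_accounts(tweets, min_followers=1000, limit=21):
    """Return the first `limit` distinct qualifying screen names, in stream order."""
    seen = []
    for tweet in tweets:
        user = tweet["user"]
        if user["followers_count"] >= min_followers and user["screen_name"] not in seen:
            seen.append(user["screen_name"])
            if len(seen) == limit:
                break
    return seen

# Tiny synthetic stream to illustrate the filter:
stream = [
    {"user": {"screen_name": "AirCanada", "followers_count": 500000}},
    {"user": {"screen_name": "smallacct", "followers_count": 42}},
    {"user": {"screen_name": "BrianneTV", "followers_count": 1200}},
]
print(sample_accounts(stream, limit=2))  # ['AirCanada', 'BrianneTV']
```

Taking accounts in stream order, rather than hand-picking them, is what keeps the sample unbiased.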

We then determined the topic(s) for each account in PeerReach, Klout, and PeerIndex. The results showed that PeerReach identified 48% of the topics correctly, while our competitors were less accurate:

[Figure: profile comparison, PeerReach vs. competitors]

Comparison 2: Benchmarking the Scores

Since influence is a subjective metric, we took a different approach to comparing our influence score with our competitors'. We built two lists of people in the Webtech industry, one ordered by each metric, and asked people from the Tech industry to compare them. Our testers were not told which list was ordered by Klout and which by PeerReach (PeerIndex was excluded from this blind test).

Our 102 testers decided the best influence list was:

[Figure: blind list comparison, PeerReach vs. competitors]
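A blind test like this reduces to counting which anonymized list each tester prefers and only mapping the labels back to providers afterwards. A minimal sketch; the vote split and label assignment below are illustrative, not the actual results:

```python
from collections import Counter

# Sketch of the blind-test tally: testers see two anonymized lists ("A", "B")
# and pick the one they find more accurate. The label-to-provider mapping is
# applied only after counting, so testers' choices can't be biased by brand.
def tally_blind_test(votes, label_to_provider):
    counts = Counter(votes)
    return {label_to_provider[label]: n for label, n in counts.items()}

labels = {"A": "PeerReach", "B": "Klout"}   # assigned at random before the test
votes = ["A"] * 70 + ["B"] * 32             # 102 testers; split is illustrative
print(tally_blind_test(votes, labels))      # {'PeerReach': 70, 'Klout': 32}
```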

*Our random sample consisted of: @Justsh3r, @Iighthouse, @FoodTruckFreak, @FreshieAce, @ZacCryder, @itsmica13, @_xxCharlieB, @ErikaS_TWxx, @ChristinaSNP, @GrandeBiebsArmy, @Samie_dior, @TraderPlanetm, @_WayneRooney_, @BrianneTV, @ZanMfStacks, @IMBeanz, @InspiredWalk, @basssam22, @Pizzahutlanka, @AirCanada, @RomeoRam