From the moment we’re born, we’re ranked.
In the delivery room we get an Apgar score. Doctors give us percentile rankings for height and weight. Schools test us constantly. FICO scores measure our creditworthiness. But you’re ranked another way you’ve probably never heard of: Secret “marketing scores” gleaned from thousands of data points are being used by a wide range of companies to determine your health-insurance rates or whether you get a mortgage or a job.
“Marketing scores are being used for all sorts of reasons beyond marketing,” says Pam Dixon, executive director of the World Privacy Forum, a San Diego-based advocacy group. “Scores in themselves aren’t bad, but secret scores present a problem.”
These are called “alternative consumer scores,” and companies calculate them by amassing large amounts of data from a variety of sources. Like credit scores, they offer insight into consumer behavior, such as your bill-paying history or propensity to rack up debt. Unlike credit scores, which fall under the protections of the Fair Credit Reporting Act and the Equal Credit Opportunity Act, these scores are not federally regulated. They are built from purchases in retail stores, online shopping, surveys and loyalty cards, as well as social-media posts and photographs.
Some of this is structured data, like your age, sex, marital status and address, the kind of information you give a credit-card issuer or a bank. Increasingly, however, unstructured data, the things that can’t be so readily classified, like photos, emails and Facebook posts, is working its way into complicated analytics that will flag a merchant about your shoe fetish or a health insurer about your smoking habit. It’s a huge piece of the “big” in Big Data.
That means data collected on you, whether through surveys you took, tweets you shared or complaints you made to a call center, are intertwined and parsed through sophisticated algorithms to give government and businesses a deeper look at you. What comes out is a profile of you that could include such nuances as how likely it is that you will buy certain products, take medications on time or run a fraud ring.
Dixon, for one, worries about the domino effect of these jumbled scoring methods. If your health-insurance costs are tied to your neighbors’ credit scores and you’re a pioneer in a gentrification project, do you get redlined? “We have to re-examine predictive analysis,” she says. “We can’t let discrimination sneak back in.”
“The Scoring of America,” which Dixon co-authored with Robert Gellman, a longtime privacy and information-policy consultant, reveals a large collection of scores used on Americans that most of us have never heard of, let alone dreamt up. The scores grade your behaviors as well as those of the people closest to you, including your neighbors, with a score that gives new meaning to having the nicest house on the block. The neighborhood score, for example, assesses the approximate credit capacity of the neighborhood, not the individuals in it.
The Affordable Care Act, for example, has an “individual risk score” that is based on your demographic and health-status information. The primary source of that information is data already collected from employers and health plans. That then determines what you will pay for care under Obamacare because it allows insurance companies to spread the risk out.
Marketing experts will tell you that these scores are not nearly as invasive as you might think and amount to nothing more than perfecting a sales pitch to you — customizing it based on your preferences, a hallmark of Big Data and Big Content’s contribution to a merchant’s sales-and-marketing playground. What’s more, they claim this is what consumers consistently say they want: tailor-made information delivered where they want it and when they want it.
But the scoring track record is hardly perfect. A Federal Trade Commission study on credit-report accuracy released last year found that one in five consumers had errors on at least one of their three credit reports, errors that affected the scores dictating what kind of interest rates they will pay on a mortgage or whether they will get certain jobs.
Those scores are derived from some pretty specific credit-card and loan data in your credit report, which itself can be a nest of misinformation. But we get to look at that and make corrections if needed.
Imagine now that a report you can never see incorrectly determines you’re a dog lover who never follows the doctor’s orders. You could end up with a bevy of dog-grooming coupons, higher health-insurance rates and a denied jumbo mortgage.
“The quality of data matters,” the report says. “Errors in data used to make a score create a score that is not predictive. With thousands of factors, error rates and false readings become a big issue.”
If you can’t see the scores, how do you know if they’re accurate? “When other consumer scores enter the marketplace without transparency or the limits that apply to credit scoring, consumer benefits are much more uncertain and unfairness is more likely,” according to the report.
To that end, Dixon and Gellman are calling on Congress to force data miners and marketers to make this information transparent to consumers, much like our credit scores and reports are.
Sometimes, Cash Is Still King
Until then, you can game the system to a certain degree by choosing cash over credit or debit cards for some purchases. “If you have something sensitive to purchase, pay in cash,” Dixon suggests. But if you’re buying something from, say, a sports retailer or equipment maker, use a credit card. “That assumes you’re athletic and may have a potential positive influence on the amount you pay for health care,” she adds.
Here’s a rundown of some of these alternative consumer scores and why they’re used, according to “The Scoring of America”:
Churn scores — These are abundant and can predict when you might dump your bank or cellphone provider for a competitor. Companies build custom in-house scores by layering historical customer sales data with outside analytics.
Tenant score — These give a history of evictions and use credit bureau and other data to determine if you will be a trustworthy lessee. They do fall under federal privacy regulations.
Insurance scores — These also are under federal mandates because they use credit scores and credit information to determine the costs of auto and homeowners insurance. Insurance scores, however, differ from credit scores because many companies have their own algorithms.
Health scores — Seven years ago when Dixon and Gellman first did research on consumer scores, they found few health scores. Now, however, they have uncovered “significant and high-impact consumer health scores in use.” Health records held by health-care providers or insurers are subject to federal health privacy rules, but there is a growing record of health information that falls outside those parameters. The information you give to health and fitness clubs, cosmetic-medicine services, massage therapists, even transit companies is compiled.
“Consumers routinely disclose health information to companies that promise to provide coupons,” the report says. “Consumers rarely understand that companies can collect personal information that they can later sell.”
Frailty scores — These generally are for the elderly and are growing in importance as our population ages. They can point to the likelihood of post-operative surgical complications or readmission to a hospital, as well as mortality within one year.
Peer-to-Peer Energy People Meter Score — You may have seen some form of this already in your gas or electric bill. It measures a residential customer’s energy-consumption patterns and compares them with those of neighbors, in hopes of inspiring better energy efficiency.
Social scores — These collect individual and household social engagement and not all are opaque. Klout, for example, will very transparently gauge overall social-media influence while Tweet Grader will analyze your Twitter account to help you step up your influence.
Law enforcement scores — A 2013 Rand report uncovered much about predictive policing and how law enforcement creates and uses risk scores to figure out who the bad guys are.
Homeland Security scores — We don’t know much about how these scores are collected or used, but we do know that airline passengers have been screened for some time now. Homeland Security collects data and links it with other sources to establish a risk score for each passenger. The Transportation Security Administration uses it too.
Fraud scores — These are used to help detect if you’re a victim of fraud or how likely you are to become one. In some cases, they can reduce fraud by 50%. But fraud scores are also used to figure out if you are the fraudster.
Predictive anti-fraud scores — The U.S. Postal Service uses predictive analytics to sift through more than 30 indicators to “flag and rank instances of suspicious activity.”
Casino-gaming propensity score — All bets are on that this one can predict gambling addictions based on online and in-person visits to casinos and gaming sites.
Jennifer Waters is a MarketWatch columnist based in Chicago. Follow her on Twitter @JenWatersMKW. This article originally appeared on MarketWatch.com and is reprinted by permission from MarketWatch.com, ©2014 Dow Jones & Co. Inc. All rights reserved.