Poor data quality is a disease which slowly destroys the value of your marketing database. Quality is damaged through incomplete information, poor data entry, and data decay. A traditional response is to purchase new records, but this only provides a temporary (and expensive) respite from your data quality issues.
The data I’ve seen indicates that contacts decay at a 25 to 30 percent annual rate. This means that a prospect list that is 90 percent accurate today will be little more than 50 percent accurate two years later. Thus, a prospect list purchase strategy is like steroids: it makes your marketing database look healthier on the day the list is purchased, but it simply masks the growing disease within your database. Treating one or two symptoms does not address the underlying problem: the lack of a broad, continuous data strategy.
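The compounding arithmetic behind that claim can be sketched in a few lines (the function below is purely illustrative, not any vendor's model):

```python
# Illustrative decay math: a list that is 90% accurate today, decaying at a
# compounding 25-30% annual rate.
def accuracy_after(initial_accuracy: float, annual_decay: float, years: int) -> float:
    """Fraction of records still accurate after `years` of compounding decay."""
    return initial_accuracy * (1 - annual_decay) ** years

low_decay = accuracy_after(0.90, 0.25, 2)   # 25% annual decay
high_decay = accuracy_after(0.90, 0.30, 2)  # 30% annual decay
print(f"Two years out: {high_decay:.1%} to {low_decay:.1%} of records remain accurate")
```

At a 25 percent decay rate the list lands at roughly 51 percent accuracy after two years, consistent with the "little more than 50 percent" figure above.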
However, if you take a holistic view of data quality that includes continuous DaaS validation, ABM look-alikes, web form enrichment, lead-to-account mapping, duplicate management, data standardization, and reference database appends, you will have a healthy database that ensures your MAP and CRM platforms contain the richest, most accurate data.
Symptoms of poor data quality include high email bounce rates, declining email sender scores, returned direct mail, duplicate records, incomplete records, accelerating unsubscribe rates, and sales reps who ignore your marketing qualified leads. So if you are concerned about your ability to target, segment, pass quality leads to sales, score leads, or build predictive models, begin with a holistic data strategy. Vendors that support holistic data quality include ReachForce, D&B Optimizer (formerly Workbench), ZoomInfo, InsideView, Oceanos, and Openprise.
Any firm that is adopting ABM, advanced lead scoring, a single view of the customer, or predictive analytics, should begin with a holistic data quality strategy. Otherwise, these advanced marketing strategies are bound to fail.
Information is a key asset on the battlefield which provides a competitive advantage to the side with better information and communication systems. While the “Fog of War” continues to be an issue, real-time information sharing helps improve military decision making and reduces the risk of both collateral damage and friendly fire accidents. Nevertheless, information remains imperfect and mistakes continue to happen.
In the corporate world, there is also a fog of corporate battle, but much of it is self-induced. We build systems that don’t talk to each other or which use different conventions for standardizing information and identifying customers and contacts. Furthermore, information is not validated and enriched as it is obtained, resulting in weak information sets.
While this lack of data synchronization creates headaches across the company, I will focus on sales and marketing platforms for brevity. Inaccurate and incomplete information causes problems within marketing platforms such as weak segmentation, poor scoring, bad targeting, and misallocated marketing resources. Bad and missing fields are then propagated to downstream systems. If information is bad when received and there are no mechanisms for validating, standardizing, and enriching it in its system of origin, misinformation flows to other platforms, resulting in an increasingly expensive set of problems and remediation costs. It is much easier and less expensive to resolve a problem at its source.
Furthermore, once leads are enriched with firmographic and biographical details, the intelligence is available to downstream platforms; and if the enrichment includes a company identifier (e.g. European Registration Number, Ticker, D-U-N-S Number), then maintaining data accuracy in downstream systems and linking the platforms is much easier.
The cost of islands of information is high for B2B firms. A few examples:
Marketing departments generate a broad set of leads through multiple channels and systems. Some of this information is anonymous and some is tied directly to contacts. How confident are you that you aren’t generating duplicate (or triplicate or quadruplicate…) information? Are you matching and enriching information as it is gathered from web forms, uploaded tradeshow spreadsheets, and purchased lists, or are you loading data “as is” with little verification or enhancement? By focusing on data quality at the outset, you are ensuring that richer and more accurate information is shared across your platforms.
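As a deliberately simplified illustration of matching at the point of capture, the sketch below keys new leads on a normalized email address before loading them; all field names and records are hypothetical:

```python
# Hypothetical match-on-ingest sketch: new leads are keyed on a normalized
# email address and merged into existing records rather than loaded "as is".
def normalize_email(email: str) -> str:
    return email.strip().lower()

# A toy existing-contact store keyed on normalized email.
existing = {normalize_email("JSmith@Acme.com"): {"email": "jsmith@acme.com", "company": "Acme"}}

def ingest(lead: dict, contacts: dict) -> str:
    key = normalize_email(lead["email"])
    if key in contacts:
        # Merge only new fields into the existing record instead of creating a duplicate.
        contacts[key].update({k: v for k, v in lead.items() if k not in contacts[key]})
        return "merged"
    contacts[key] = dict(lead)
    return "created"

print(ingest({"email": " jsmith@ACME.com ", "title": "VP Marketing"}, existing))  # merged
print(ingest({"email": "afox@globex.com"}, existing))                             # created
```

A production matcher would also key on name, company, and fuzzy variants, but even this minimal normalization step prevents the duplicate and triplicate records described above.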
Marketing invests large sums in generating marketing qualified leads (MQLs), which are then frequently ignored or cherry-picked by sales. Some of this disconnect stems from a lack of agreement on what constitutes a good lead, but some also reflects a lack of front-end intelligence applied by marketing. A lead may be considered marketing qualified but lack key information to pass muster with sales (How big is the company? What industry is it in? What are the contact’s job function and level? What technologies do they use? Will the company be approved by credit after I’ve invested months in landing the deal?). Knowing that a lead downloaded a whitepaper earlier in the day signifies interest in a topic, but not the ability or authority to make purchasing decisions. Furthermore, it provides a thin reed upon which to base a sales conversation.
Channel conflicts are introduced when bad or missing information results in a lead being directed to the wrong sales rep. Leads which lack accurate firmographics and linkage information are likely to be routed to the wrong team or rep. Thus, a lead generated at a subsidiary or branch location of a major firm may be routed to a territory rep instead of a named account rep, resulting in both channel conflict and a greater likelihood that the lead will be ignored. Of course, if the lead was poorly routed, it also is likely that the lead was improperly scored and assigned to the wrong segments for targeting and analytics.
Finally, the lack of standards and cross-platform communication make it difficult to obtain a unified view of the customer. An October 2015 survey of global executives by Forbes Insights found that 63% believed that a more complete/unified view of the customer would result in more accurate predictions of customer needs and desires. Other benefits included improved customer experience/service (60%), greater feedback for product/service innovation (55%), and a greater ability to target and optimize for specific customers (50%).
For decades, technology strategists have warned about the problems of creating data islands across one’s IT platforms. If systems are unable to speak with each other or data lacks consistency across systems, then it is impossible to develop a holistic view of one’s business and customers. And while the problem seems large today, it will only grow in scope with the advent of the Internet of Things. So if you think the fog of corporate battle is difficult in 2017, failing to address it will only make the problem many-fold more difficult to tackle in the years to come.
As we are one month away from the new year, it is a good time to think about budgeting for data quality in 2018.
I know it isn’t glamorous, but that doesn’t mean it is unnecessary.
Data quality software has improved markedly over the past few years. No longer is it necessary to download a file, forward it to a vendor, and wait for them to process your marketing file. Sales and marketing operations can now set up automated cloud cleansing that works within Marketo, Eloqua, Salesforce, Microsoft Dynamics, and other enterprise applications. B2B vendors to consider include Dun & Bradstreet, InsideView, ZoomInfo, and ReachForce.
These platforms perform both initial batch match & append and ongoing enrichment, ensuring that your sales and marketing files have both accurate and complete data. These services also support company and contact prospecting, data health reports, suppression lists, and segmentation reporting. A few even offer free data quality reports, deduplication, technographic enrichment, nixie files (defunct companies and departed exec files), web form support, sales intelligence services, and contact verification and standardization (e.g. address, phone, and email) for non-matched records.
As these services reside in the cloud and offer cloud connectors for the major MAPs and CRMs, the operational overhead is minimal, allowing operations to focus on ABM look-alikes, segmentation, and improved targeting instead of file management.
What’s more, data quality improvements benefit sales, marketing, and downstream systems. A record cleansed and verified as it is created costs much less than a bad record passed down to other enterprise platforms. Beyond direct cost reduction (storing bad data, marketing to departed execs, sales calls to abandoned voicemails, reduced time keying and updating records manually), there are improvements to segmentation, targeting, lead scoring, lead routing, and messaging.
So budget for data quality in 2018. It isn’t glamorous, but it is effective.
Openprise launched a Data Marketplace to assist with ingesting and normalizing third-party B2B and B2C data. Amongst the platforms supported are Salesforce, Marketo, Eloqua, and Pardot. The Data Marketplace, part of the Openprise Data Orchestration platform, includes built-in rules to ensure data is properly onboarded. Users can set primary, secondary, and tertiary providers with multi-vendor data normalization rules.
“We’re excited to make ZoomInfo’s 210 million businesspeople and 11 million businesses available on the Openprise Data Marketplace,” said Phil Garlick, VP Corporate Development at ZoomInfo. “Openprise’s data cleansing and unification capabilities, combined with ZoomInfo’s data accuracy, provides marketing and sales teams with an unparalleled solution to run more effective campaigns.”
Other B2B partners include InsideView, Orb Intelligence, Synthio (formerly Social123), and Dun & Bradstreet. Additional vendors are in the final stages of certification. Openprise claims that new data providers can be set up in minutes.
Customers can extend pre-existing vendor contracts or take advantage of pre-negotiated discounts.
“Earlier this year, we surveyed 175 marketing professionals to identify data marketplace trends and published our findings in the B2B Data Market Industry Report,” said CEO Ed King. “We found that companies that worked with multiple data providers were much more likely to be satisfied with their third-party data, but those same companies expressed how much they struggled with pulling multiple providers’ data into their marketing and sales system of record while maintaining a consistent set of standards. Openprise Data Marketplaces solves this problem.”
The B2B Data Market Industry Report also asked which vendors were being deployed. The survey of 175 B2B marketers at firms with at least 200 employees found the top three vendors were ZoomInfo (40%), InfoUSA (36%), and Data.com (35%). Surprisingly, Sales Genie matched D&B/Hoovers among all surveyed marketers and exceeded it among enterprises. InfoUSA’s rates were likely higher than the other firms’ because it offers both business and consumer data, while Dun & Bradstreet/Hoovers and many of the other vendors offer strictly B2B data.
The most common use case for B2B data vendors is identifying additional contacts at target companies (62%). Marketers also looked to B2B companies to identify additional target accounts (52%) and append missing fields (50%). Only 37% were looking to B2B data vendors to cleanse their database.
The survey participants were well distributed across B2B industries, with an overweighting toward advertising/marketing.
Oceanos began as a list broker back in 2002, but has since evolved into a B2B contact aggregator and data refinery. The firm aggregates 97 million active US contact records and retains millions of inactive names and emails to assist with hygiene. Data is aggregated from eleven vendors and includes social data from FullContact and Pipl. Oceanos provides data enrichment, TAM analysis, net-new contacts, and a set of data specialists to assist with projects.
Oceanos is self-funded and based in Marshfield, MA. Annual revenue is around $5 million and is derived from data hygiene services, contact matching, and API-based data licensing.
Each record is assigned a data quality score based upon eleven signals including dead email addresses, drops between files, email naming conventions, and social data verification. Thus, customers and partners can employ data quality score cutoffs when licensing data. Data quality scores are also employed as part of a free Data Health Check for customers.
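A weighted-signal score of this kind might look like the sketch below. The signal names and weights are hypothetical, since Oceanos does not publish its actual eleven-signal model:

```python
# Hypothetical weighted signals; the real Oceanos model uses eleven signals
# with undisclosed weights.
SIGNALS = {
    "email_deliverable": 0.40,          # no dead email address on last verification
    "present_in_latest_file": 0.25,     # contact did not drop between files
    "matches_naming_convention": 0.20,  # email fits the company's naming pattern
    "social_profile_verified": 0.15,    # social data corroborates the record
}

def quality_score(record: dict) -> float:
    """Return a 0-100 score from boolean signal flags."""
    return round(100 * sum(w for s, w in SIGNALS.items() if record.get(s)), 1)

strong = quality_score({"email_deliverable": True, "present_in_latest_file": True,
                        "matches_naming_convention": True})
weak = quality_score({"social_profile_verified": True})
# A licensing cutoff might then keep only records scoring, say, 70 or above.
```

The point of the sketch is the cutoff mechanic: once every record carries a score, licensing "only records above 70" becomes a one-line filter.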
Contacts are mapped to 12 Job Functions, 109 Sub-functions, and 7 Job Levels. Granularity to the sub-function level assists with strategic targeting. For example, marketing is mapped to 18 sub-functions, including brand, corporate communications, events, public relations, search engine, and social media.
The Data Health Check report does not directly validate emails and other fields, but employs the Data Quality Scores to provide an overall Data Quality Score and a data accuracy histogram. The service also provides proposed before and after fill rates across twenty biographic and firmographic variables including address fields, direct and corporate phone, employees, revenue, industry, and major social handles. The final element of the Data Health Check report is a set of segmentation charts by job level, function, specialty, domain extension, industry, sizing variables, and country.
At the end of the report, there is a data health recommendation where they contrast their Account Based Marketing approach to traditional database augmentation services:
With other health checks, this is the section where they tell you that they have a plethora of contacts for you to purchase, all matching your data profile. First, the point of this analysis is not to assume that the accounts and contacts currently in your database represent the optimal mix. In many cases the results demonstrate that there is a percentage of bad and misaligned data. Second, it’s important to note that this is not a quantity game. In fact, the more-the-merrier mindset is the root of many database problems.
We recommend a three step process to effectively cleanse, complete, and grow your database. This is the ideal approach, but we understand that timing and budget do not always permit the perfect solution. That being the case, we suggest a conversation to review the health check results and to determine the best prescription based on your needs and goals.
The Cleanse and Complete stages purge bad data, standardize and validate data, and enrich the client’s database. Cleansing processes the file against FreshAddress, Clickback, internal tables, Whitepages, Pipl and FullContact. Only once the current data quality issues are addressed does the firm recommend a Contact Gap Analysis for populating accounts with missing strategic contacts. The analysis also identifies the percentage of contacts that match the target audience criteria (Ideal Customer Profile). Oceanos contends that best-in-class firms have at least 70% of contacts within their target audience.
The Contact Gap Analysis also provides a greenfield (net-new contact analysis) by job function and sub-function and an analysis of Total Addressable Market (TAM) coverage between House and Greenfield contacts.
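The ICP-fit measurement behind the 70 percent benchmark reduces to a simple ratio. The sketch below uses made-up criteria and contacts to show the computation:

```python
# Hypothetical ICP-fit check: what share of house contacts matches the
# target-audience criteria? Criteria and contacts are invented for illustration.
ICP = {"functions": {"marketing", "sales"}, "min_employees": 200}

contacts = [
    {"function": "marketing", "employees": 500},
    {"function": "finance",   "employees": 1200},
    {"function": "sales",     "employees": 350},
    {"function": "marketing", "employees": 50},
]

in_icp = [c for c in contacts
          if c["function"] in ICP["functions"] and c["employees"] >= ICP["min_employees"]]
coverage = len(in_icp) / len(contacts)
print(f"{coverage:.0%} of contacts fit the ICP")  # 50% here, well short of the 70% benchmark
```

A real gap analysis would match on many more firmographic and biographic fields, but the output is the same: a fit percentage to compare against the best-in-class threshold.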
Integration services include a partner API, used by ReachForce, Engagio, and Integrate, as well as Cleanse and Append MAP connectors for Marketo and Eloqua.
Oceanos President Brian P. Hession identified the firm’s differentiators as its unique blend of technology, professional services, and data quality. With data quality critical to ABM sales and marketing initiatives, real-world project fulfillment by its program specialists yields insights that continuously inform and enhance Oceanos’ data quality processes. “We apply both technology and real-world insights to ensure the highest quality of data before we are releasing it. We are incorporating a continuous stream of data quality insights into our code to address the many nuances that a program specialist encounters manually on a dataset,” said Hession. “The way that Oceanos is going to be successful in the future is if we can assemble an internal contact database that is of the highest quality in the industry. So there’s been a lot of focus on putting models on top of our contact data.”
InsideView announced the launch of a professional services group to offer Expert Services to its customers. The new Data Concierge service “helps customers navigate the data complexities involved in strategic go-to-market initiatives.”
The Expert Services group is staffed by a team of consultants and engagement managers. “B2B sales and marketing leaders have asked for our help with operationalizing ABM and other targeted go-to-market initiatives. To meet this need, our services team has evolved from delivering data projects (like clean, email validation, contact append) to more consultative services like Target Market Analytics. In the future, we anticipate growth in this part of the business to fulfill market demand,” said VP of Product & Solutions Marketing, Joe Andrews.
The first formal service assists ABM customers with analytics around their target market and helps customers define their Total Addressable Market (TAM). The concierge service includes a data visualization console which “enables customers to more effectively select the right accounts for their account-based marketing (ABM) initiatives.”
“A picture drives a business conversation in real-time as executives can run ‘what-if’ scenarios and make decisions in real-time,” said Andrews.
The visualization tool provides both existing and whitespace segmentation by state, country, industry, and sizing variables. Data is displayed as both raw counts and TAM penetration percentages. The product even loads in current account customer data to detail the distribution of current accounts within and outside of the ideal customer profile (ICP). Users can also drill down on segments providing a more granular view by state, size, industry, etc.
The dashboard may be downloaded as a PDF for sharing with team members or executives. Users may also download records from the Dashboard.
“We’ve always strived to be a strategic partner for our customers, and many companies underestimate the data complexities involved in go-to-market planning and execution. Now we have an expert services team dedicated to help guide customers step-by-step through the process of defining their ideal customer profile, identifying their TAM, and making sure the right targets are in their database. Our goal is for InsideView Expert Services to be a trusted partner for customers looking for help with any big, thorny data-driven initiative for marketing or sales.”
InsideView CRO John Kelly
InsideView noted that many of their customers had captured less than ten percent of their TAM in their database.
“When it’s time to pick accounts, marketing and sales can have different ideas about which should make the list,” wrote Forrester Principal Analyst Laura Ramos. “Successful marketers always lead with data to identify the characteristics that distinguish “good” opportunities from those selected through feel, anecdote, or intuition.”
While InsideView continues to offer and enhance its sales intelligence products, its emphasis over the past several years has been in launching marketing products such as InsideView Refresh (automated CRM cleansing), InsideView Enrich (real-time lead enrichment), InsideView Target (Marketing Prospecting), and a broad set of MAP and CRM connectors.
InsideView is not the first sales intelligence vendor to provide such services. Avention has long provided data and professional services, and it launched the DataVision platform last year for TAM analysis and white space company and contact identification. Likewise, ZoomInfo shifted its emphasis from sales to marketing and launched its Growth Acceleration Platform to assist with ICP identification and net-new prospecting. While not the first entry in the space, Expert Services represents a maturing of vendor capabilities in support of marketing departments and ABM projects.

Over the past several years, the sales intelligence vendors have retrained their focus from sales insights delivered via browsers to sales and marketing intelligence services that deliver consultative and automated data services via enterprise platforms, mobile devices, and broad connectors. This support goes beyond simply offering sales-product Build a List functionality to marketing departments. It includes a set of tools and services for identifying the ideal customer profile, sizing the total addressable market, identifying white space target accounts and contacts (i.e. net-new leads), supporting web forms, automating batch and ongoing enrichment of MAPs and CRMs, prioritizing leads, and assisting with lead-to-account mapping, segmentation analysis, and campaign targeting. Other ABM features which sales intelligence vendors have begun rolling out include visitor site identification, programmatic marketing, social enrichment, Chrome connectors, ABSD (account based sales development) vendor integrations, and light predictive analytics.
“We’ve helped several customers along their ABM journey, and we’ve realized their needs extend beyond company and contact data,” blogged InsideView Product Marketing Manager Jyothsna Durgadoss. “In order to be successful, they also require data visualization, technical talent, and expert guidance — all of which are critical for effectively identifying ideal targets, uncovering total addressable market, and selecting the right accounts for ABM.”
InsideView noted that many sales and marketing departments retain an ad-hoc approach to defining their ABM targets and TAM. According to InsideView, “At a recent Sirius Decisions Summit, a poll of the keynote audience revealed that more than 50 percent had an ad hoc or nonexistent approach to measuring total addressable market.” (See bar chart on left)
The best way to keep data clean is to use a globally known, unique identifier, or a “data backbone.” My company prefers to use URLs as identifiers. They’re free, globally recognizable, high-quality data points that enable you to efficiently gather information on a business’s industry, online activities, and functionality. For example, Cisco is a company that also goes by Cisco Systems, Inc. and Cisco Precision Tools. If sales containers required users to type in one unique URL, http://www.cisco.com/ for all those different branches, it’d be much more difficult to create duplicate accounts, which helps keep data clean. Perhaps more important, URLs facilitate communication between people, systems, and even departments. Whether it’s the customer relationship management platforms used by sales teams, enterprise resource planning software used by purchasing teams, or the account-based marketing technology employed by marketing teams, the business intelligence platform can recognize a unique URL and attach it to clean, usable data. Unique identifiers let you know you’re pulling from the sources and contacts you’ve intended to track.
I agree with 90% of what Fowler states, but disagree with his recommendation that URLs are the best unique identifier for his “data backbone”. There are a number of reasons that URLs fall short:
URLs are not persistent. If a company is acquired or renames itself, the old identifier (URL) is not retained. This creates a potential disconnect between the old and new name.
URLs have a many-to-one mapping which treats most subsidiary and branch locations the same as the headquarters. For some companies, mashing together all locations into a single record may be sufficient, but it is a highly flawed approach as it loses much of the nuance concerning companies that operate across multiple sectors and countries (e.g. General Electric). It also makes it very difficult for sales reps to sell deeper into an organization which lacks linkage data.
Conversely, companies with multiple URLs are not tied together. This can happen due to differing country domains (e.g. .uk, .fr), division names, brand names, and subsidiaries. Each of these scenarios treats related businesses as separate companies. Amazon has many distinct businesses including Amazon Web Services (aws.amazon.com), Zappos (www.zappos.com), Alexa Internet (www.alexa.com), Audible (www.audible.com), Internet Movie Database (www.imdb.com), and soon Whole Foods (www.wholefoods.com). URLs do not provide a consistent data backbone when subsidiaries, acquisitions, and branches have different domains.
When a division or facility is divested, there is no way to determine which locations have been spun off.
Franchises are treated as part of the parent company when they are separate legal entities.
Not all companies have websites.
URLs can be sold. They can also be reused if a company goes out of business or abandons a URL.
Finally, business decisions related to logistics, credit, supplier risk, and financing need to understand the underlying structure of companies. It is not just marketing and sales that are impacted by standardizing on a non-persistent, quasi-unique identifier.
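The multiple-URL problem is easy to demonstrate in code: even careful domain normalization leaves the Amazon-owned properties listed above as unrelated keys. The normalization below is a simple sketch, not a production entity matcher:

```python
# Naive URL-based company keys: normalize each URL to a bare domain.
# Ownership facts are from the discussion above; the logic is illustrative.
from urllib.parse import urlparse

def domain_key(url: str) -> str:
    """Lowercase the host and strip a leading 'www.' to form a company key."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

amazon_properties = [
    "https://aws.amazon.com",
    "https://www.zappos.com",
    "https://www.audible.com",
    "https://www.imdb.com",
    "https://www.wholefoods.com",
]

keys = {domain_key(u) for u in amazon_properties}
print(len(keys))  # 5 distinct keys for one corporate family
```

Five distinct keys for one corporate family: without external linkage data, a URL backbone cannot roll these businesses up to a common parent.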
I would therefore recommend looking at credit data companies as a better source of unique identifiers. Companies such as Dun & Bradstreet, Experian, Equifax, and Infogroup all offer location-level detail and linkage tied to unique identifiers that have been developed over multiple decades. They offer sophisticated entity matching and enrichment tools, such as Dun & Bradstreet’s Optimizer service. Furthermore, these firms support multiple functions across the organization, assisting with cross-platform entity linking and on-demand decisioning.