Rhetorik: What Does GDPR Mean for B2B Marketing? (Part II)

Yesterday, I presented a discussion of Legitimate Interest as a lawful basis for GDPR communications.  For B2B companies in the UK, the PECR (Privacy and Electronic Communications Regulations 2003) is often applicable when assessing GDPR and Data Privacy:

GDPR and Data Privacy under UK PECR and Non-PECR scenarios (Source: Rhetorik)

The PECR discusses soft opt-ins for individuals, sole traders and some partnerships, but not B2B.  The ICO states that “the term ‘soft opt-in’ is sometimes used to describe the rule about existing customers. The idea is that if an individual bought something from you recently, gave you their details, and did not opt out of marketing messages, they are probably happy to receive marketing from you about similar products or services even if they haven’t specifically consented. However, you must have given them a clear chance to opt out – both when you first collected their details, and in every message you send.  The soft opt-in rule means you may be able to email or text your own customers, but it does not apply to prospective customers or new contacts.”

Legitimate Interest also applies to data licensing relationships and marketing partnerships.  If personal data interest is maintained for a specific purpose (e.g. Technology Sales), data licensing and sharing needs to be kept within the original scope.

Legitimate Interest and Consent also apply within a company.  Data maintained for one product line may not be usable for others, particularly if the firm spans multiple sectors.

The UK Direct Marketing Association published guidance on Legitimate Interest that helps make sense of Article 6.1.f:

“Processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.”

And Recital 47:

“The legitimate interests of a controller, including those of a controller to which the Personal Data may be disclosed, or of a third party, may provide a legal basis for processing, provided that the interests or the fundamental rights and freedoms of the data subject are not overriding, taking into consideration the reasonable expectations of data subjects based on their relationship with the controller.”

Once the basis of holding personal data is met, companies have additional conditions to meet around transparency (notification and the right to object), data minimization (Is there a legitimate interest in collecting all of the fields? How long is data retained?), and reasonable expectation (limited impact to personal and private life; ensuring data accuracy).

For individuals who opt out, firms must retain suppression lists to prevent the re-collection of personal information.  The suppression list should be the minimal information required to ensure the individual is not added back into the marketing database at a later date.  With B2B, the list may simply be name and email.
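To make the mechanics concrete, here is a minimal sketch of such a suppression check in Python (the hashing approach and field choices are my own illustrative assumptions, not a prescribed implementation — hashing simply keeps the list to the minimum needed to recognize the individual):

```python
import hashlib

def _key(name: str, email: str) -> str:
    # Normalize and hash so the suppression list stores only enough
    # to recognize the individual, not a full marketing profile.
    raw = f"{name.strip().lower()}|{email.strip().lower()}"
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()

class SuppressionList:
    def __init__(self):
        self._keys = set()

    def suppress(self, name: str, email: str) -> None:
        self._keys.add(_key(name, email))

    def is_suppressed(self, name: str, email: str) -> bool:
        return _key(name, email) in self._keys

# Check before any purchased or re-collected record is loaded.
suppressed = SuppressionList()
suppressed.suppress("Jane Doe", "jane.doe@example.com")
print(suppressed.is_suppressed("Jane Doe", "JANE.DOE@example.com"))  # True
```

Any record matching the list is dropped before it re-enters the marketing database, regardless of which channel re-collected it.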

The GDPR also sets out expectations which are relationship specific:

  • Suspects – legitimate interest, reasonable expectation, transparency
  • Prospects – reasonable expectation; consent
  • Clients – contract, legitimate interest, reasonable expectation, data minimization, transparency

Part III of Rhetorik’s presentation discusses GDPR myths and applicable laws across Europe.


GDPR Article 6.1

Rhetorik: What Does GDPR Mean for B2B Marketing?

I’ve been looking for a good description of what GDPR (General Data Protection Regulation) means to B2B marketers and finally came across a session given by UK technology profiler Rhetorik.  There have been a number of issues that have muddied the waters, making it difficult to provide much more than general rules.  Amongst the issues are a focus on the implications to consumer marketers, the lack of a general law that spans the EU, and an emphasis on rumors and fears about what will happen to firms that fail to comply with the regulation.

Rhetorik Data Protection Officer Samantha Magee noted that GDPR covers how and why companies hold and protect data.  It is focused on internal processes rather than external communications, and is channel agnostic.

In around 18 months, the EU will pass uniform ePrivacy legislation which covers external communications in member countries.  Until then, rules will remain fragmentary.  For example, Opt-in or Opt-out protocols differ by country with the UK amongst the more liberal countries:

Opt-in / Opt-out workflow by country (Source: Rhetorik)

For the moment, GDPR has given teeth to local regulations.  In the UK, the PECR (The Privacy and Electronic Communications Regulations of 2003), overseen by the Information Commissioner’s Office (ICO), remains the applicable regulation for consumer, sole trader, and small partnership communications.  It was drafted after the European Directive 2002/58/EC, otherwise known as the ‘e-privacy Directive’, was adopted in 2002.

There are six bases for communicating with clients and prospects, all of which have equal weight: Consent, Contract, Legal Obligation, Vital Interest, Public Task, and Legitimate Interest.  Of these, Consent (e.g. opt-in) and Legitimate Interest are the most common for B2B marketers.  Support and service departments would most likely be covered under contractual relationships.

“Legitimate Interest aims to provide a solid and lawful basis upon which commercial communication can occur, allowing marketers to promote their products and services to a targeted and well defined audience,” said Magee.  “At its heart is the desire to ensure that commercial practices and communications are relevant to the individual, offering the assurance that high standards of care are applied and that their essential privacy rights are considered of the utmost importance.”


Part II continues with a discussion of the UK PECR law and additional details on Legitimate Interest.

DealSignal Total Audience Platform

DealSignal, which offers an on-demand platform for Total Audience and Contact Data Management for B2B marketing and sales, recently rolled out its Total Audience Metrics (TAM) module.  The new platform helps sales and marketing professionals improve Go-to-Market and Demand Planning processes by allowing them to measure and visualize their total audience and determine coverage gaps in their CRM and MAP.  The new platform analyzes TAM by persona, account segment, and buying committees (what SiriusDecisions calls Demand Units).

“We’ve run hundreds of TAM analyses for B2B marketing teams in various industries and customers are consistently surprised to find that they’re missing more than 80 percent of their target audience—the contacts that fit their target personas and ideal customer profile. TAM coverage is currently averaging 18 percent in existing CRM and MAP systems. It’s a big ‘aha moment’ to learn that you’re missing out on marketing or selling to a large majority of your potential buyers. Often, the best potential buyers – those most likely to convert – are among the missing contacts found in the gap analysis.”

  • DealSignal CEO Rob Weedn

The firm is seeing rapid uptake on its TAM service which is available as either a freemium (TAM Estimates) or paid option (TAM Actuals).  “Early feedback is that this is a great way to verify the counts and size up the Outbound and/or ABM marketing programs over the upcoming year,” said Weedn.

According to DealSignal, TAM Estimates are accurate to ± 20% of Accounts and Contacts.  “We’ve been offering this for a few months and it is very popular” with customers and prospects “leveraging this analysis for initial demand planning and budgeting,” said Weedn.  “TAM Actuals is a Paid Offering, charged based on credits on our platform, which provides perfectly accurate Total Audience metrics based on Accounts and Contacts.”

The DealSignal platform dynamically discovers, refreshes, and verifies records based on the TAM criteria.

DealSignal has adopted the term TAM, but calls it Total Audience Metrics instead of Total Addressable Market.  Weedn explained the difference between the DealSignal and Classic TAM approach:

Total Addressable Market is a classic, static, top-down analysis, based on sample or partial market data, typically performed by market research and analyst firms like IDC, Gartner, etc.  “Classic TAM” is not necessarily an accurate sizing of the market; it is infrequently updated; and, most importantly, there is no real way for marketing and sales teams to plan marketing and sales programs with a classic, static, top-down TAM, and definitely no way to execute against the Accounts and Contacts in that TAM.

DealSignal is here to help marketers market and sellers sell, so we perform an accurate, bottom-up, dynamic analysis, based on complete market data, of the actual counts of the Total Audience – which we define as the Accounts that meet Target Market criteria (Industry, Employee, Revenue, Technologies Used, etc.) and Contacts that meet Ideal Buyer Persona criteria.  Further, our Total Audience Metrics/Measurements include a process to dynamically discover and verify the underlying Accounts and Contacts, so TAM Analysis is dynamic, based on actuals, and can be updated on demand.  The Accounts and Contacts can then be converted, with one click, into fully enriched and verified Account/Contact Profiles with Contact Information to be used in marketing and selling initiatives.

Using the DealSignal platform, users can define target personas and Ideal Customer Profiles (ICPs) to build out their TAMs, using micro-targeting criteria such as Titles, Profile Keywords, and Locations that yield results as ranked lists of relevant accounts and contacts. The module compares the TAM against the CRM and identifies gaps by account, industry, geography, etc.  DealSignal provides the TAM based not only on CRM data and large third-party sources, but through dynamic sourcing and verification, so the TAM results are “comprehensive and accurate” with net-new accounts and contacts.

DealSignal combines APIs, algorithms, and human intelligence to achieve a much higher level of contact accuracy (95 – 100% according to the firm) than most vendors.  The company provides a 100% guarantee on all Account and Contact data.  The system enriches and verifies existing leads, contacts and accounts.  As it conducts dynamic data sourcing, DealSignal claims account enrichment match rates between 95 and 100% and lead enrichment match rates between 85 and 100%.

DealSignal TAM Analysis Module

DealSignal dynamically discovers, enriches and verifies account and contact lists through a combination of AI robots and researchers combined with CRM and MAP feedback loops.  The firm claims a deliverability rate between 94 and 97% and reverifies data on demand for every customer request, with a two week window for contact aging.  Records that fall outside of the two-week window are reverified overnight.

“Since static data-at-rest quickly becomes dated, we do not trust it, you should not trust it, and you should certainly not rely on it to define or optimize your vital marketing or sales programs. It must be renewed and refined at runtime,” said Weedn.  “We believe in dynamically refreshing and re-verifying data on-demand, when it needs to become active and put into a marketing or sales process—and we’ve uniquely designed the DealSignal platform to do just that.”

DealSignal has automated and editorial processes that place its data quality at a level claimed only by DiscoverOrg.  Both firms utilize editorial teams for staying ahead of the 25 to 30% contact decay rate suffered by static databases.  DiscoverOrg performs a full data verification every 90 days while DealSignal performs a just-in-time data quality review overnight.

“Marketers and sales teams currently rely on solutions that provide 50 to 80% quality.  That is a B- or F on a test, and we need to change the expectation to impeccable quality, at 95-100% (A or A+) to greatly improve marketing and sales performance,” said Weedn.

Last month, DealSignal released a GDPR risk assessment module which enriches CRM data with contact locations and flags EU-based leads.  Users can also choose to exclude EU-based leads.
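The core location check is simple to sketch (a rough illustration with assumed country codes and record fields, not DealSignal’s actual implementation):

```python
# EU member-state ISO country codes (the EU-28 of the period, including the UK).
EU_COUNTRIES = {
    "AT", "BE", "BG", "HR", "CY", "CZ", "DK", "EE", "FI", "FR",
    "DE", "GR", "HU", "IE", "IT", "LV", "LT", "LU", "MT", "NL",
    "PL", "PT", "RO", "SK", "SI", "ES", "SE", "GB",
}

def flag_eu_leads(leads, exclude=False):
    """Mark each lead whose country is in the EU; optionally drop those leads."""
    flagged = []
    for lead in leads:
        lead = dict(lead, eu_based=lead.get("country") in EU_COUNTRIES)
        if exclude and lead["eu_based"]:
            continue
        flagged.append(lead)
    return flagged

leads = [{"email": "a@example.com", "country": "DE"},
         {"email": "b@example.com", "country": "US"}]
print(flag_eu_leads(leads, exclude=True))  # only the US lead remains
```

The real work, of course, is in enriching each contact with an accurate location in the first place; once that field is reliable, the flag-or-exclude step is trivial.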

“B2B marketers are faced with many challenges today: identify and engage their total audience, try to keep their audience data fresh and accurate, and comply with new regulations like GDPR. Given the negative consequences associated with GDPR, most marketers are scrambling to review and re-verify the location and status of their contacts,” said Weedn.

Leads are pre-purchased on a volume basis with 1,000 credits running $895.  Volume discounts kick in at 5, 10, 25, 50 and 100 thousand credits.

D&B Optimizer for Marketing

D&B Optimizer for Marketing — Key Features

Dun & Bradstreet rebranded D&B Workbench Data Optimizer as D&B Optimizer for Marketing and announced a set of enhancements to the platform.  The Workbench name, now dropped, went back to the product’s origins as NetProspex Workbench, one of the first DaaS Hygiene / Enrichment / Prospecting platforms.  The rebranded product includes a series of new features including an Analyze module, Salesforce Contact Optimization, custom email deliverability targets, and NAICS industry code support.

“This new name reflects Dun & Bradstreet’s commitment to deliver the very best in data optimization services,” the firm wrote to its clients.  The new name is also consistent with its other Optimizer solutions: D&B Optimizer for Salesforce and D&B Optimizer for Microsoft.

The new Analyze module delivers profiling and market opportunity analysis “utilizing D&B Master Data and proprietary machine-made analytics.”  Features include dynamic dashboards which help marketers visualize their primary profile by revenue, employee size, and industry.  The service also provides look-alike opportunities to assist with ABM expansion and pipeline growth.

The new Salesforce Integration for Contact Optimization supports contact cleansing and enrichment at a frequency determined by the customer.  Dun & Bradstreet claims that the Salesforce integration may be set up in fewer than twenty minutes.

Custom Email Deliverability Levels allow marketers to dip deeper into Dun & Bradstreet’s pool of emails and select contacts with lower reliability scores.  The default level is 90% deliverability, but highly targeted selects may require using contacts that are below the 90% deliverability threshold.  Dun & Bradstreet called the 90% threshold “our recommended level for most email campaigns.”

Finally, D&B Optimizer for Marketing added NAICS industry code selects.  The product already supports the older US SIC industry taxonomy.

Other D&B Optimizer for Marketing features include data validation and standardization (email, phone, address), duplicate flagging, data hygiene reports, lead prospecting, segmentation analysis, and data enrichment (firmographics, D-U-N-S Numbers, corporate linkages, technographics, biographics).

Holistic Data Quality

Contact databases are subject to a 25% decay rate; thus, a contact database that is 90% accurate today will be 70% accurate a year from now and less than 50% accurate after nine quarters.

Poor data quality is a disease which slowly destroys the value of your marketing database.  Quality is damaged through incomplete information, poor data entry, and data decay.  A traditional response is to purchase new records, but this only provides a temporary (and expensive) respite from your data quality issues.

The data I’ve seen indicates that contacts decay at a 25 to 30 percent annual rate.  This means that a prospect list that is 90 percent accurate today will be little more than 50 percent accurate two years later.  Thus, a prospect list purchase strategy is like steroids: it makes your marketing database look healthier on the day the list is purchased, but it simply masks the growing disease within your database.  Treating one or two symptoms does not address the underlying problem — a lack of a broad, continuous data strategy.
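The arithmetic behind these figures is simple compound decay, which a few lines of Python can verify (assuming a constant 25% annual decay rate applied quarterly):

```python
def accuracy(initial: float, annual_decay: float, quarters: int) -> float:
    """Database accuracy after n quarters under compound annual decay."""
    quarterly_retention = (1 - annual_decay) ** 0.25
    return initial * quarterly_retention ** quarters

# A list that is 90% accurate today, decaying 25% per year:
print(round(accuracy(0.90, 0.25, 8), 3))  # two years: ~0.506
print(round(accuracy(0.90, 0.25, 9), 3))  # nine quarters: ~0.471, below 50%
```

At the higher 30% decay rate, the picture is grimmer still: the same list falls below 50% accuracy in under two years.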

However, if you take a holistic view of data quality which includes continuous DaaS validation, ABM look-alikes, web form enrichment, lead-to-account mapping, duplicate management, data standardization, and reference database appends, you will have a healthy database that ensures your MAP and CRM platforms contain the richest, most accurate data.

Vendors that support holistic data quality include ReachForce, D&B Optimizer (FKA Workbench), ZoomInfo, InsideView, Oceanos, and Openprise.  So if you are concerned about your ability to target, segment, pass quality leads to sales, score leads, or build predictive models, then begin with a holistic data strategy.  Symptoms of poor data quality include high email bounce rates, declining email sender scores, returned direct mail, duplicate records, incomplete records, accelerating unsubscribe rates, and sales reps that ignore your marketing qualified leads.

Any firm that is adopting ABM, advanced lead scoring, a single view of the customer, or predictive analytics, should begin with a holistic data quality strategy.  Otherwise, these advanced marketing strategies are bound to fail.

The Fog of Corporate Battle

Churchill’s WWII Map Room (Source: Kaihsu Tai / Creative Commons)

Information is a key asset on the battlefield which provides a competitive advantage to the side with better information and communication systems.  While the “Fog of War” continues to be an issue, real-time information sharing helps improve military decision making and reduces the risk of both collateral damage and friendly fire accidents.  Nevertheless, information remains imperfect and mistakes continue to happen.

In the corporate world, there is also a fog of corporate battle, but much of it is self-induced.  We build systems that don’t talk to each other or which use different conventions for standardizing information and identifying customers and contacts.  Furthermore, information is not validated and enriched as it is obtained, resulting in weak information sets.

While this lack of data synchronization creates headaches across the company, I will focus on sales and marketing platforms for brevity.  Inaccurate and incomplete marketing information causes problems within marketing platforms, such as weak segmentation, poor scoring, bad targeting, and misallocated marketing resources.  Bad and missing fields are then propagated to downstream systems.  If information is bad when received and there are no mechanisms for validating, standardizing, and enriching it in its system of origin, misinformation flows to other platforms, resulting in an increasingly expensive set of problems and remediation costs.  It is much easier and less expensive to resolve a problem at its source.

Furthermore, once leads are enriched with firmographic and biographical details, the intelligence is available to downstream platforms; and if the enrichment includes a company identifier (e.g. European Registration Number, Ticker, D-U-N-S Number), then maintaining data accuracy in downstream systems and linking the platforms is much easier.
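A minimal sketch of why the shared identifier matters (the record shapes and the use of the D-U-N-S Number as the join key are illustrative assumptions): with a common key, linking a marketing lead to its CRM account becomes a dictionary lookup rather than fuzzy company-name matching.

```python
# Hypothetical leads enriched in the marketing platform with a D-U-N-S Number.
marketing_leads = [
    {"email": "cio@acme.example", "duns": "123456789"},
    {"email": "vp@initech.example", "duns": "987654321"},
]

# CRM accounts keyed by the same company identifier.
crm_accounts = {
    "123456789": {"name": "Acme Corp", "owner": "named-account rep"},
}

def link(lead, accounts):
    """Attach the lead to its CRM account via the shared company identifier."""
    account = accounts.get(lead["duns"])
    return {**lead, "account": account["name"] if account else None}

linked = [link(lead, crm_accounts) for lead in marketing_leads]
# The first lead resolves to Acme Corp; the second is net-new (no match).
```

Without the identifier, the same join would require name normalization and fuzzy matching, with all the mis-routing risks described above.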

The cost of islands of information is high for B2B firms.  A few examples:

  • Marketing departments generate a broad set of leads through multiple channels and systems.  Some of this information is anonymous and some is tied directly to contacts.  How confident are you that you aren’t generating duplicate (or triplicate or quadruplicate…) information?  Are you matching and enriching information as it is gathered from web forms, uploaded tradeshow spreadsheets, and purchased lists, or are you loading data “as is” with little verification or enhancement?  By focusing on data quality at the outset, you are ensuring that richer and more accurate information is shared across your platforms.
  • Marketing invests large sums in generating marketing qualified leads (MQLs) which are then frequently ignored or cherry-picked by sales.  Some of this disconnect is a lack of agreement on what constitutes a good lead, but some is also a lack of front-end intelligence being applied by marketing.  A lead may be considered marketing qualified but lack key information to pass muster with sales (e.g. How big is the company?  What industry are they in?  What is the job function and level of the contact?  What technologies do they use?  Will they be approved by credit after I’ve invested months in landing the deal?)  Knowing that a lead downloaded a whitepaper earlier in the day signifies interest in a topic, but not the ability or authority to make purchasing decisions.  Furthermore, it provides a thin reed upon which to base a sales conversation.
  • Channel conflicts are introduced when bad or missing information results in a lead being directed to the wrong sales rep.  Leads which lack accurate firmographics and linkage information are likely to be routed to the wrong team or rep.  Thus, a lead generated at a subsidiary or branch location of a major firm may be routed to a territory rep instead of a named account rep, resulting in both channel conflict and a greater likelihood that the lead will be ignored.  Of course, if the lead was poorly routed, it also is likely that the lead was improperly scored and assigned to the wrong segments for targeting and analytics.

Finally, the lack of standards and cross-platform communication make it difficult to obtain a unified view of the customer.  An October 2015 survey of global executives by Forbes Insights found that 63% believed that a more complete/unified view of the customer would result in more accurate predictions of customer needs and desires.  Other benefits included improved customer experience/service (60%), greater feedback for product/service innovation (55%), and a greater ability to target and optimize for specific customers (50%).

For decades, technology strategists have warned about the problems of creating data islands across one’s IT platforms.  If systems are unable to speak with each other or data lacks consistency across systems, then it is impossible to develop a holistic view of one’s business and customers.  And while the problem seems large today, it will only grow in scope with the advent of the Internet of Things.  So if you think the fog of corporate battle is difficult in 2017, failing to address it will only make the problem many-fold more difficult to tackle in the years to come.

Data Quality in 2018

D&B Optimizer (FKA Workbench) Value Proposition


As we are one month away from the new year, it is a good time to think about budgeting for data quality in 2018.

I know it isn’t glamorous, but that doesn’t mean it is unnecessary.

Data Quality software has markedly improved over the past few years.  No longer is it necessary to download a file, forward it to a vendor, and wait for them to process your marketing file.  Sales and Marketing Operations can now set up automated cloud cleansing that works within Marketo, Eloqua, Salesforce, Microsoft Dynamics, and other enterprise applications.  B2B vendors to consider include Dun & Bradstreet, InsideView, ZoomInfo, and ReachForce.

These platforms perform both initial batch match & append and ongoing enrichment, ensuring that your sales and marketing files have both accurate and complete data.  These services also support company and contact prospecting, data health reports, suppression lists, and segmentation reporting.  A few even offer free data quality reports, deduplication, technographic enrichment, nixie files (defunct companies and departed exec files), web form support, sales intelligence services, and contact verification and standardization (e.g. address, phone, and email) for non-matched records.

As these services reside in the cloud and offer cloud connectors for the major MAPs and CRMs, the operational overhead is minimal, allowing operations to focus on ABM look-alikes, segmentation, and improved targeting instead of file management.

What’s more, data quality improvements benefit sales, marketing, and downstream systems.  A record cleansed and verified as it is created costs much less than a bad record passed down to other enterprise platforms.  Beyond direct cost reduction (storing bad data, marketing to departed execs, sales calls to abandoned voicemails, reduced time keying and updating records manually), there are improvements to segmentation, targeting, lead scoring, lead routing, and messaging.

So budget for data quality in 2018.  It isn’t glamorous, but it is effective.

The ReachForce Contact Enrichment Summary Report.