The California Consumer Privacy Act (CCPA) took effect this week, but enforcement will be delayed for six months. “We’re going to help folks understand our interpretation of the law,” said California Attorney General Xavier Becerra. “And once we’ve done those things, our job is to make sure there’s compliance, so we’ll enforce.”
“CCPA marks an important step toward providing people with more robust control over their data in the United States,” wrote Microsoft’s Chief Privacy Officer Julie Brill. “It also shows that we can make progress to strengthen privacy protections in this country at the state level even when Congress can’t or won’t act.”
CCPA requires firms to be transparent in how they collect and use consumer data. Individuals also have the option to block sales of personal data. However, “Exactly what will be required under CCPA to accomplish these goals is still developing,” wrote Brill.
Microsoft supports a national privacy law which covers “more robust accountability requirements,” including minimizing data collection, transparency around how data is being used, and “making them more responsible for analyzing and improving data systems to ensure that they use personal data appropriately.”
Facebook is hedging, saying “we do not sell people’s data” without acknowledging that its business is based on monetizing member data and that it has a poor history of controlling partner data collection on its platform.
Salesforce CEO Marc Benioff called Facebook the “new cigarettes for our society,” which undermines societal trust. On CNN’s Reliable Sources, Benioff called for Facebook to be regulated or split up. “They’re certainly not exactly about truth in advertising. Even they have said that. That’s why we’re really in squarely a crisis of trust, when the core vendor themselves cannot say that trust is our most important value. Look, we’re at a moment in time where each one of us in every company has to ask a question: What is our highest value?”
“I expect a fundamental reconceptualization of what Facebook’s role is in the world,” continued Benioff. “When you have an entity that large with that much potential impact, and not fundamentally doing good things to improve the state of the world, well, then I think everyone is going to have it in its crosshairs.”
In a blog titled, “Maintaining the Trust of our Members,” LinkedIn recommitted itself to a members-first approach. The Microsoft subsidiary frames its decision-making with the question, “Is this the right thing to do for our members?”
Along with a members-first policy, LinkedIn employs four principles to frame decisions:
Members maintain clarity, consistency, and control over their data. This goal is manifested in a broad set of privacy settings, observing the stated wishes of each member, and protecting their data. Microsoft employs a global GDPR standard and does not transfer member data to other companies. For example, LinkedIn Sales Navigator provides view-only access to member data: profiles are displayed within CRMs and other partner applications, but no data is transferred to those platforms.
LinkedIn will remain a safe, trusted, and professional platform. The firm removes content which violates their Professional Community Policies and removes fake profiles, jobs, and companies.
LinkedIn is committed to removing unfair bias from its platform so that individuals with equal talent have equal access to opportunity. “To achieve this goal, we are committed to building a product with no unfair bias that provides opportunity to all of our members. There is a lot of work still to do, but we are focused on working across our company, with our members and customers, and across the industry to close the network gap.”
As a global platform, they are committed to respecting the laws that apply to them and “contributing to the dialogue” about legal frameworks.
LinkedIn Advertising is subject to an initial review. LinkedIn vets ads to ensure they are non-discriminatory:
“Even if legal in the applicable jurisdiction, LinkedIn does not allow ads that advocate, promote, or contain discriminatory hiring practices or denial of education, housing, or economic opportunity based on age, gender, religion, ethnicity, race, or sexual preference. Ads that promote the denial or restriction of fair and equal access to education, housing, or credit or career opportunities are prohibited.”
Blake Lawit, LinkedIn General Counsel
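The policy quoted above amounts to a category screen applied before an ad can run: any ad that restricts opportunity on a protected basis is rejected regardless of local legality. A toy version of such a check might look like the sketch below; the field names and category list are hypothetical illustrations, not LinkedIn's actual review system.

```python
# Hypothetical protected bases, loosely following the quoted policy.
PROHIBITED_BASES = {"age", "gender", "religion", "ethnicity", "race", "sexual_orientation"}

def vet_ad(ad: dict) -> bool:
    """Reject any ad that restricts opportunity on a protected basis,
    even if that restriction is legal in the ad's jurisdiction."""
    restricted = set(ad.get("restricts_opportunity_by", []))
    return not (restricted & PROHIBITED_BASES)

print(vet_ad({"restricts_opportunity_by": ["age"]}))  # False: rejected
print(vet_ad({"restricts_opportunity_by": []}))       # True: passes review
```

Note that the check is jurisdiction-independent by design, mirroring the policy's “even if legal in the applicable jurisdiction” clause.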
The statement of principles comes at a time when other social media firms are struggling to develop rules and policies around political advertising. LinkedIn does not carry political advertising and also restricts ads for adult content, illegal products, health, gaming, weapons, multi-level marketing, alcohol, tobacco, and financial products (payday loans, cryptocurrency).
LinkedIn continues to grow its customer base with 660 million members across 200 countries and 30 million companies. The top countries are the United States (165M members), India (62M), China (48M), Brazil (40M), and the UK (27M).
LinkedIn maintains offices in nine US cities and 24 international locations. The platform supports 24 languages.
LinkedIn CEO Jeff Weiner raised concern about a tide of tech regulation following recent data privacy scandals. Of particular concern is the impact of removing tech company immunity for the content shared by users under Section 230 of the Communications Decency Act. If the Section were removed, social networks would be forced to proactively censor posts.
Weiner warned that unless the technology industry engages in greater self-regulation, “you’re going to see more regulatory oversight.”
Even as the wide use of algorithms has provided a megaphone to misinformation and fringe social media, regulation can have unintended consequences. “The unintended consequences work both ways,” said Weiner. “Companies make decisions only with the best of intentions, and there are unintended consequences of those decisions. But from a regulatory perspective, I think it’s the same thing.”
“You could stifle a lot of innovation. You could stifle a lot of openness. You could stifle a lot of the things that create value by virtue of changing these liability rules and laws. That is just almost a canonical example of where these unintended consequences would really proliferate. The things companies would need to do to ensure that they were protected is going to hurt the way in which people can communicate with one another.”
LinkedIn CEO Jeff Weiner
LinkedIn operates in China, where it is subject to censorship. The firm decided to enter the market as its mission is to create economic opportunity globally. “The censorship issue in China is always a painful one,” he said. “It has to be navigated and managed in the context of the broader vision.” While LinkedIn is advocating for Section 230, its parent company has taken a pro-regulatory view on data privacy, calling for an American version of GDPR. Microsoft has built GDPR into the infrastructure of its platforms.
Artesian Solutions, the UK Sales Intelligence vendor, has been teasing its Artesian Risk and Compliance Hub (ARCH) compliance service for over a year. The new offering, now Generally Available, “enables relationship managers, underwriters and frontline teams within banks, insurance companies, and other financially regulated industries to quickly assess and better understand their corporate clients at the start of the customer journey and throughout the life of the customer.”
Financial services generally perform KYC / AML (Know Your Customer / Anti-Money Laundering) processing during onboarding, but ARCH moves initial processing to frontline staff at the top of the sales funnel before a client is signed. This “distributed compliance” helps expedite the process, sets client expectations when processing may take longer than normal, and allows relationship managers to avoid prospects that will face arduous compliance processing or may not meet the institution’s “appetite.”
ARCH flags risks which may require additional information from the client. By flagging them at the outset, the RM can request the missing data before it delays onboarding.
ARCH performs event-driven reviews which begin before onboarding and continue through the life of the loan or policy. Thus, KYC is no longer subject to periodic reviews but is performed dynamically as new information about the client is ingested by ARCH. Instead of client reviews determined by the calendar, events can trigger full client reviews as needed.
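The event-driven review model described above can be sketched as a small dispatcher: rather than scheduling reviews by calendar, each event ingested about a client is matched against triggers that decide whether a full review is queued. The event names, `Client` class, and trigger list below are illustrative assumptions, not Artesian's actual API.

```python
from dataclasses import dataclass, field

# Hypothetical event types that might warrant a fresh KYC review;
# a real institution would configure its own trigger set.
REVIEW_TRIGGERS = {"director_change", "ccj_filed", "sanctions_hit", "ownership_change"}

@dataclass
class Client:
    name: str
    pending_review: bool = False
    history: list = field(default_factory=list)

def ingest_event(client: Client, event_type: str) -> bool:
    """Record the event; queue a full review only when it matches a trigger."""
    client.history.append(event_type)
    if event_type in REVIEW_TRIGGERS:
        client.pending_review = True
    return client.pending_review

acme = Client("Acme Ltd")
ingest_event(acme, "address_update")   # routine change: no review queued
ingest_event(acme, "director_change")  # trigger event: review queued
print(acme.pending_review)  # True
```

The point of the pattern is that review timing falls out of the data stream itself, rather than from a periodic calendar.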
ARCH supports commercial insurance policy writing “with a combination of data and sophisticated rules, bringing efficiency, consistency, and accuracy so that underwriters can focus on underwriting. Decisions can be recorded whilst both justified in the future and used for decision analysis and pricing optimisation.” Artesian’s fine-grained taxonomy and assisted machine learning help to identify potential underwriting risks “according to the predetermined definitions of an insurer.”
moving compliance reviews to front-line workers, commercial insurers can
perform a KYC check and risk evaluation prior to quoting a policy. “One
reason for using it is that they might want to look at what gets declared to
them by the new customer compared to what they can see from ARCH,” said
Artesian VP of Risk Solutions Matt Elsom. “To do that they can have a
look at some of the fraud-focused data sources and financial data.”
A survey by Fenergo of global financial services executives found that poor onboarding negatively impacts client experience and reduces the lifetime value (LTV) of clients. 36% acknowledged losing customers due to onboarding issues, and 84% tied the onboarding experience to reduced LTV.
Figure 2: “The Cost of Poor CX,” Fenergo, January 2019. N=250 global Financial Services executives (Source: Artesian Solutions)
Artesian noted that KYC compliance team workloads have “grown beyond all expectations” due to the availability of international ownership linkages and ultimate beneficial ownership data. “The overall effect of this is an MLRO [Money Laundering Reporting Officer] and board being put under pressure to reduce onboarding delays whilst maintaining adherence to regulation – and the only effective solution has been to recruit more compliance analysts. The cost associated with this approach has become unsustainable as the work queue continues to grow simply to maintain current levels of new business.”
According to Artesian, “ARCH is not only an innovative new technology, but a huge leap forward in the drive for ‘distributed compliance’ – the ability for central teams to distribute KYC and AML tasks to their frontline colleagues who are best placed to engage with the client and solve issues in the fastest, most productive way. It places compliance and powerful risk data at the heart of the business – front of mind for every member of staff, informing every decision, instructing every interaction and shaping every relationship from pre-screening prospective new customers through to ongoing tracking and long-standing client development.”
ARCH’s “configurable decision engine” monitors real-time credit risk and KYC data sets and applies bank or insurer policies to compliance decisions. Each client determines which data sources to ingest and “applies custom policies to that combined data in the form of multi-dimensional rules” which are screened and interpreted based upon institutional policies. Flagged issues are delivered through a browser interface or loaded into other compliance systems via an API.
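A configurable decision engine of this kind can be approximated as a set of institution-defined predicates evaluated over the merged client record, with each tripped rule emitted as a flag. The rule names, record fields, and thresholds below are assumptions for illustration, not ARCH's actual configuration format.

```python
# Each institution configures its own rules; here a rule is a name plus
# a predicate over the merged client record (credit + KYC data).
RULES = [
    ("low_credit_score", lambda c: c.get("credit_score", 999) < 40),
    ("on_sanctions_list", lambda c: c.get("sanctioned", False)),
    ("high_risk_sector", lambda c: c.get("sector") in {"gambling", "crypto"}),
]

def screen(client: dict) -> list[str]:
    """Return the names of all rules the client record trips."""
    return [name for name, predicate in RULES if predicate(client)]

flags = screen({"credit_score": 35, "sector": "crypto"})
print(flags)  # ['low_credit_score', 'high_risk_sector']
```

Keeping the rules as data rather than code is what makes the engine “configurable”: the same screening loop serves every institution's policy set.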
“We have the great privilege of serving 80% of the UK’s major banking institutions, providing powerful sales engagement insights to relationship managers. We asked what we could do to make our software even more useful and the answer was ARCH. Almost two years of engineering and millions of pounds later we’re announcing ARCH’s general availability for customers. We believe this puts Artesian in a unique position to be able to combine customer engagement capabilities together with credit and risk in one single application delivered through a browser or mobile device.”
“We’ve built a strong team of specialists to extend our core competencies and have worked closely with our key partners at Experian, LexisNexis, and Refinitiv (Thomson Reuters), with more partnerships to come. This allows our customers to select the data sources they already rely upon and trust and easily integrate them into ARCH.”
Artesian Solutions CEO Andrew Yates
In a beta test with a top UK bank, ARCH decisioning was fully consistent with existing bank processing while flagging 14% more “critical risk” issues than current bank processes. ARCH also reduced average case time from two hundred minutes to eight, “allowing relationship managers to know more, know sooner and save time – enabling them to focus on delivering a better customer experience.”
ARCH was recently added to Scale InsureTech, an eleven-week incubator program “aimed at identifying and developing fast-growth technology companies in the financial services” sector. Artesian’s financial services clients include RBS, Barclays, HSBC, Lloyds Bank, and EY, among others.
North American sales intelligence firms do not normally support client onboarding and risk assessment, but UK and European firms support these functions due to a richer set of registry data. European vendors such as DueDil, Bureau van Dijk, and Artesian support sales, marketing, and regulatory compliance.
As GDPR hit its first anniversary on Saturday, Microsoft once again called for a US privacy law which shifts the onus of data privacy from the individual to corporations. Today, Americans operate in an opt-out regime which requires them to find and manage their privacy settings.
This regime “places an unreasonable — and unworkable — burden on individuals,” wrote Microsoft’s Deputy General Counsel Julie Brill. “Strong federal privacy [legislation] should not only empower consumers to control their data, it also should place accountability obligations on the companies that collect and use sensitive personal information.”
Microsoft prefers a single federal standard to piecemeal state-level laws such as California’s CCPA. Brill said the legislation should be interoperable with the GDPR to help reduce the “cost and complexity of compliance.” This framework should reflect “the changing understanding of the right to privacy in the United States and around the world.” The proposed legislation should “uphold the fundamental right to privacy through rules that give people control over their data and require greater accountability and transparency in how companies use the personal information they collect.”
“For American businesses, interoperability between U.S. law and GDPR will reduce the cost and complexity of compliance by ensuring that companies don’t have to build separate systems to meet differing — and even conflicting — requirements for privacy protection in the countries where they do business,” said Brill.
According to eMarketer analyst Ross Benes, the US ad industry has shifted from a call for self-regulation to supporting national privacy regulations, fearing “a patchwork of different rules” as “legislation looks increasingly inevitable.”
A TrustArc/Ipsos survey of UK adults (16 – 75) found a 36% improvement in trust concerning personal data since GDPR went into effect.
A Snow study found that 39% of global business professionals believe their data is better protected since GDPR passed, with the biggest increase in the APAC region (48%). 40% of Europeans also believed their personally identifiable information is more secure, but only 30% in the US held the same belief.
74% of surveyed professionals believe that the technology industry needs more regulation with 83% of APAC and 72% of US respondents wanting additional tech regulation.
The EU has yet to strictly enforce the law, with only one large fine ($56M), levied against Google in France. However, Google and the social media and advertising companies are all subject to ongoing suits:
The latest investigation — the first by the Irish watchdog into Google — brings to 19 the number of open cases by the regulator targeting big U.S. tech companies. They include probes into Apple Inc., Twitter Inc., eight probes into Facebook Inc., plus one into Instagram and two into WhatsApp.
Los Angeles Times, “Google could face hefty EU fine over possible privacy violations,” May 22, 2019
“What is important to recognize is that the EU is taking GDPR very seriously, with fines being established for any breach,” said Ben Feldman, SVP of strategy and innovation at NYIAX. “I would expect that the first six-to-nine months of any new regulation action would be spent working out the kinks and processes of implementation. It is quite likely that we will see more fines in the future.”
The following is a Quora post answering the question, “Does LinkedIn Sell Your Info?”
This is likely to fall into a semantics question. If data is employed in the aggregate and your personally identifiable information is not disclosed, then I would argue that your information is not sold. Likewise, if you are presented an ad because your LinkedIn profile conforms with a target audience definition, your data is also not being sold.
I can’t answer for LinkedIn Recruiter, but can answer in the Sales and Marketing context.
LinkedIn offers a sales product called Sales Navigator. Users can view company and contact information on Navigator just as they can on the free service. It even supports viewing this data within third-party SNAP products. However, Navigator and SNAP are view only: sales reps cannot download your profile or sync it with any partner platform. LinkedIn also restricts display of your email and phone information to your direct connections, along with any other content you flag as restricted.
LinkedIn Marketing sells advertising on LinkedIn and Bing based upon your profile attributes. Advertisers define their target audience across a broad set of firmographic, career, and location variables, but these segments are not provided directly to the marketer. Instead, they are used for advertising display. Thus, your data isn’t sold, just your eyeballs.
LinkedIn treats its members’ data with respect. Microsoft, its parent company, has called for a US version of GDPR, the European data privacy standard. CEO Satya Nadella stated that “privacy is a fundamental human right” on an April 2018 earnings call and said that the firm has implemented an “end-to-end privacy architecture” which is GDPR compliant.
Artesian Solutions CEO Andrew Yates published a year-in-review blog and a preview of their upcoming Artesian Risk and Compliance Hub (ARCH). The new ARCH capabilities will extend their social selling platform into Know Your Client (KYC) reviews at UK banks. ARCH is in early testing.
ARCH leverages Artesian capabilities around interpreting structured and unstructured data ”to create useful flags and to drive appropriate actions.” Artesian already is on the desktop of relationship managers (RMs) at most of the major UK banks. “This puts us in a unique position to make insights regarding financial and KYC risks available to the front-line as a pre-screen, to ensure that corporate banking relationships begin with an appropriate understanding of risk.”
ARCH supports an automated audit trail and storage of evidence. Early tests found ARCH to be “100% accurate in reflecting policy in pre-screening.” ARCH also reduced the time spent gathering risk assessment data by 90% and identified 14% more risk issues compared with manual processing.
By running a pre-screen at the front-end of client discussions, RMs can focus on new clients that will pass muster during the onboarding review process. This process makes both relationship managers and compliance professionals more effective. RMs will no longer be spending time with prospective clients that won’t pass compliance review, while compliance professionals can focus their attention on more complex reviews which require their skill and expertise.
“ARCH gives companies control of a sophisticated decision engine to enable data being accessed to have rules applied and flags created. It means that Relationship Managers can see a summarised view of what their central risk teams assessment of a potential client would be, before spending time and money engaging with them. The automation aspect of this is fundamental as it brings efficiency, consistency and control to the areas it transforms.
But more than that, it places compliance at the heart of the business – front of mind for every member of staff, informing every decision, instructing every interaction and shaping every relationship from pre-screens for new customer prospecting through to long-standing client development.”
Artesian CEO Andrew Yates
Yates cites McKinsey research which notes that the risk function at financial institutions is being transformed, “with the detection, assessment, and mitigation of risk” being transferred to all employees by 2025.
Risk and Compliance tools are a greater focus amongst European sales intelligence firms due to the availability of private company registry data. While US private companies provide only minimalist filings with Secretaries of State offices (with a few exceptions in insurance, banking, and nonprofits), UK company registration data includes directors, shareholders, and financials. Other UK compliance data includes sanctions lists, Politically Exposed Persons (global government officials and relatives), disqualified directors, gazettes (shuttered business and those in receivership), and traditional credit reports. Vendors such as Artesian, DueDil, and Bureau van Dijk have recently emphasized compliance and risk tool development over sales intelligence offerings.
Artesian reached 30,000 users in 2018, with their user base tracking over 800,000 companies. According to Yates, Artesian customers “have received 12.5 million actionable insights, 2.5m unique computational matches each week, automated the equivalent of 2 trillion Google searches per week (13bn per hour), and have made 523,813 useful connections using Artesian data.”
Artesian staff provided over 350 training sessions, webinars, and workshops to more than 3,000 users in 2018. Artesian Academy delivered an additional 1,200 multi-media tutorials, certification modules, role-based tips, and social media best practices overviews.
While the change is pro-privacy and consistent with GDPR, TechCrunch took a negative view of the new setting.
A win for privacy on LinkedIn could be a big loss for businesses, recruiters and anyone else expecting to be able to export the email addresses of their connections.…[The new option] could prevent some spam, and protect users who didn’t realize anyone who they’re connected to could download their email address into a giant spreadsheet. But the launch of this new setting without warning or even a formal announcement could piss off users who’d invested tons of time into the professional networking site in hopes of contacting their connections outside of it…
On a social network like Facebook, barring email exports makes more sense. But on LinkedIn’s professional network, where people are purposefully connecting with those they don’t know, and where exporting has always been allowed, making the change silently seems surreptitious. Perhaps LinkedIn didn’t want to bring attention to the fact it was allowing your email address to be slurped up by anyone you’re connected with, given the current media climate of intense scrutiny regarding privacy in social tech. But trying to hide a change that’s massively impactful to businesses that rely on LinkedIn could erode the trust of its core users.
TechCrunch overstates the loss. First, members control their data, not LinkedIn or LinkedIn connections. Second, there are multiple ways to reach users from within LinkedIn, including InMail, messaging, and PointDrive. Unless the email is blocked on the profile, connections still have access to emails from within LinkedIn. Finally, most emails in LinkedIn are personal emails, not business emails (an issue LinkedIn should address by allowing both and setting privacy and messaging rules around multiple emails), so reaching out to individuals via email only makes sense for friends, family, and recruiters, not businesspeople networking with colleagues and clients.
While LinkedIn wasn’t transparent about the privacy change, it enhanced the privacy of its members. As such, looking for nefarious reasons for the enhancement is a reach.
Speaking at the 40th International Conference of Data Protection and Privacy Commissioners (ICDPPC), Apple CEO Tim Cook forcefully called for expanded global privacy protections akin to GDPR:
Our own information — from the everyday to the deeply personal — is being weaponized against us with military efficiency. These scraps of data, each one harmless enough on its own, are carefully assembled, synthesized, traded and sold. Taken to the extreme this process creates an enduring digital profile and lets companies know you better than you may know yourself. Your profile is a bunch of algorithms that serve up increasingly extreme content, pounding our harmless preferences into harm…
We shouldn’t sugarcoat the consequences. This is surveillance…
We should celebrate the transformative work of the European institutions tasked with the successful implementation of the GDPR. We also celebrate the new steps taken, not only here in Europe but around the world — in Singapore, Japan, Brazil, New Zealand. In many more nations regulators are asking tough questions — and crafting effective reform.
It is time for the rest of the world, including my home country, to follow your lead.
We see vividly, painfully how technology can harm, rather than help. [Some platforms] magnify our worst human tendencies… deepen divisions, incite violence and even undermine our shared sense of what is true or false.
This crisis is real. Those of us who believe in technology’s potential for good must not shrink from this moment…
They may say to you our companies can never achieve technology’s true potential if there were strengthened privacy regulations. But this notion isn’t just wrong, it is destructive — technology’s potential is and always must be rooted in the faith people have in it. In the optimism and the creativity that stirs the hearts of individuals. In its promise and capacity to make the world a better place.
It’s time to face facts. We will never achieve technology’s true potential without the full faith and confidence of the people who use it.
He also warned about the dangers of AI which fails to protect privacy:
Artificial intelligence is one area I think a lot about. At its core this technology promises to learn from people individually to benefit us all. But advancing AI by collecting huge personal profiles is laziness, not efficiency.
For artificial intelligence to be truly smart it must respect human values — including privacy. If we get this wrong, the dangers are profound. We can achieve both great artificial intelligence and great privacy standards. It is not only a possibility — it is a responsibility…
Yesterday, Cook tweeted that privacy is a human right based upon four principles:
Data Minimization – Personal data collection should be minimized or de-identified.
Transparency – Individuals have the right to know what is being collected and for what purpose.
Right to Access – “data belongs to users” with personal data available to individuals for copying, correcting, and deleting.
Right to Security – “security is foundational to trust and all other privacy rights.”
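The data-minimization principle above is commonly implemented by dropping or de-identifying direct identifiers before storage. A minimal sketch, assuming hypothetical field names, with a one-way hash retained so records can still be linked without storing the raw address:

```python
import hashlib

# Illustrative set of direct identifiers to strip before storage.
IDENTIFYING_FIELDS = {"name", "email", "phone"}

def minimize(record: dict) -> dict:
    """Drop direct identifiers; keep a one-way hash of the email so
    records remain linkable without storing the address itself."""
    out = {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}
    if "email" in record:
        out["email_hash"] = hashlib.sha256(record["email"].encode()).hexdigest()
    return out

slim = minimize({"name": "A. User", "email": "a@example.com", "plan": "pro"})
print(sorted(slim))  # ['email_hash', 'plan']
```

Note that a plain hash only pseudonymizes: under GDPR-style rules, salted or keyed hashing and access controls are still needed before such data counts as de-identified.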
One of the concerns raised by GDPR is fear of draconian fines, but that should not be a concern in the UK, at least for those who act in good faith. “I have no intention of changing our proportionate and pragmatic approach,” said UK Information Commissioner Elizabeth Denham. “Hefty fines will be reserved for those organisations that persistently, deliberately, or negligently flout the law.”
And while many have complained that GDPR is a major hindrance to traditional marketing, it redirects efforts towards better targeted accounts and prospects. “B2B direct marketing is alive and well, and is explicitly envisaged in the GDPR legislation,” said Kevin Savage, Rhetorik’s Chief Revenue Officer. “You can do B2B marketing, and you should because compliance requirements are really a blessing in disguise. Relying on Legitimate Interest requires you to be more mindful and selective about the personal data you keep and use. This selectivity enables you to be more targeted in your messaging, to cut through the noise and engage prospects more effectively.”
Please find the underlying statutes for major European countries, courtesy of Rhetorik: