Here’s How to Stop Facebook Tracking


Nearly 75% of users did not know about the social-media
company’s detailed ad-preferences page


Roughly half of Facebook users in a new Pew Research Center survey are assigned a political label by the platform. Facebook has promised more transparency about ads on its platform, but the majority of users remain in the dark about the kind of information that has been collected on them.



That’s according to a study released Wednesday by the Pew
Research Center, a Washington, D.C.-based think tank. The vast majority of
users surveyed (74%) said they were not aware that Facebook lists their interests for advertisers and
that these interests can be found on the “ad preferences” page on user
profiles. Those preferences run the gamut from pop culture, consumer purchases
and “likes” to “multicultural affinity” and political labels.


More than half (51%) of users said they were not comfortable
with Facebook making such a list.




One in five Facebook users (21%) reported being listed as having a “multicultural affinity,” the Pew Research survey found. Of those, 43% were assigned an affinity with African American culture, 43% with Hispanic culture, and 10% with Asian American culture.
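As a rough illustration, those percentages can be combined to estimate what share of all users surveyed fall into each affinity group. This is only a back-of-the-envelope sketch assuming the Pew figures apply uniformly:

```python
# Estimate the share of all surveyed users assigned each "multicultural
# affinity," using the Pew figures: 21% of users are listed at all, and of
# those, 43% / 43% / 10% are tied to each culture below.
listed = 0.21
split = {
    "African American": 0.43,
    "Hispanic": 0.43,
    "Asian American": 0.10,
}

# Multiply the overall listing rate by each group's share of the listed.
overall = {group: listed * share for group, share in split.items()}
for group, share in overall.items():
    print(f"{group}: {share:.1%} of all users surveyed")
```

By this estimate, roughly 9% of all users surveyed are assigned an African American or Hispanic affinity, and about 2% an Asian American affinity.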

“Facebook’s detailed targeting tool for ads does not offer affinity classifications for any other cultures in the U.S., including Caucasian or white culture,” Pew researchers said in the report.

Roughly half (51%) of those in this survey are given a political label. Some 73% of those assigned a label on their political views say the listing “very accurately or somewhat accurately” describes their views, Pew said.

“These findings relate to some of the biggest issues about technology’s role in society,” said Lee Rainie, director of internet and technology research at Pew Research Center. “They are central to the major discussions about consumer privacy, the role of micro-targeting of advertisements in
commerce and political activity, and the role of algorithms in shaping news and
information systems.”

After a scandal over how Facebook data was used by the
firm Cambridge Analytica to influence the 2016 elections, Facebook has promised
to better educate users about how their data is collected and shared. The company
is under federal investigation for privacy violations stemming from the
Cambridge Analytica affair.

Despite these scandals, Facebook actually does provide
“lots of options for users to control their ad preferences,” said Abhishek
Iyer, technical marketing manager at Demisto, a Cupertino, Calif.-based security firm.
But it apparently has not communicated these tools well enough to users, he
added, if only one in four users is aware of the ad-preferences page.


“If we accept the premise that no ad is really ‘good’ and
businesses driven by ad revenue will always be incentivized in anti-user ways,
Facebook at least tries to give users more preferences through these features,”
he said. “But there seems to be a difference between intent and effect here.”


Facebook spokesman Joe Osborne told MarketWatch the
company encourages discussions about ad transparency and controls. The
interests list is generated by a user’s actions on Facebook, such as clicking on
posts from pages they follow, like a favorite sports team’s page, or clicking on other
ads. He said Facebook often receives complaints from users that ads are not
relevant, so the company tries to balance making ads useful with not violating
user privacy.

“We want people to see better ads — it’s a better outcome
for people, businesses, and Facebook when people see ads that are more relevant
to their actual interests,” Osborne said. “One way we do this is by giving
people ways to manage the type of ads they see.”

“Pew’s findings underscore the importance of transparency
and control across the entire ad industry, and the need for more consumer
education around the controls we place at people’s fingertips,” he added. “This
year we’re doing more to make our settings easier to use and hosting more
in-person events on ads and privacy.”

How to change your preferences:

• The list of interests Facebook thinks you have can be
found under Settings>Ads>Your Ad Preferences. Here, you can remove
interests that are not relevant to you, or delete all interests to preserve your
privacy, by clicking the “x” button in the upper right-hand corner of each topic.

• Facebook has a special section for controlling
advertisements about “sensitive topics,” including “parenting,” “alcohol,” and
“pets.” To change these, go to
Settings>Ads>Your Ad Preferences>Hide ad topics.

• Under “Ad Settings,” users can reject certain invasive
practices, refusing to let Facebook show ads based on data from
partners, ads based on their activity on other Facebook products (such as
Instagram), and ads based on their “social actions,” such as liking
a page.


In addition to the data-sharing scandal sparked by the
Cambridge Analytica revelations, Facebook has been accused of violating the
Fair Housing Act in a complaint filed by the U.S. Department of Housing and
Urban Development (HUD).

The department accused Facebook of helping landlords sell
housing to specific demographics based on data it collects. The issue is still
under investigation. At the time, Facebook said in a statement there is “no
place for discrimination” on Facebook.

“Over the past year we’ve strengthened our systems to
further protect against misuse,” the company said. “We’re aware of the
statement of interest filed and will respond in court; we’ll continue working
directly with HUD to address their concerns.”


Facebook still has a “multicultural affinities” listing
on its ad preference page — meant to designate people who likely have an
interest in a racial or ethnic culture, according to Pew. You can check your
own and see what race Facebook advertisers think you are by using the steps
above.

These classifications were often accurate. Of those
assigned a multicultural affinity, 60% said they had a “very” or “somewhat”
strong affinity for the group they were assigned, compared with 37% who said
they did not have a strong affinity or interest. And 57% of those assigned a
group said they considered themselves to be a member of that group.


While Facebook has lately been the target of many investigations
over such practices, it is far from the only company that engages in
them, said David Ginsburg, vice president of marketing at security
firm Cavirin.


“It really goes beyond Facebook and privacy,” he said.
“This is no different from privacy agreements that average 2,500 words or more.
The real threat is a society that becomes increasingly fragmented, as
traditional networks and media fall by the wayside. Subscribers are fed only
those ads — and news for that matter — that reinforce their previously-held
beliefs.”


Courtesy: MarketWatch.com