A Parent Explains Why They Oppose NY’s ‘SAFE For Kids Act’

DATE POSTED:June 6, 2024

Editor’s note: We’ve written a few times about NY’s “SAFE for Kids Act” and its many problems. There’s a decent chance that bill gets voted into law this week. Samuel Johnson posted a wonderfully detailed letter about why he, as a parent, opposes the law, and sent it to his elected officials. He also posted it on his own blog about Upstate NY, and kindly agreed to let us repost it here.

For a number of months now, the New York State Legislature has been kicking around its own internet regulation bill. Like similar bills in many other states, it is deeply flawed, relying on animosity towards Big Tech and a fundamental misunderstanding of how the internet, and computer technology in general, works.

Unfortunately, the bill is likely to pass in the next few days. As a parent with many other demands on my time, I was unable to put together a letter outlining my opposition to it until now. Below is the current draft of a letter I intend to send to my state assemblyman, with similar versions to be sent to my senator, the Governor, and others in a position to oppose the bill.

If you are a New Yorker, please reach out to your legislators and the governor. Time is desperately short.

Dear Mr. Steck:

I write to you today to express my concerns about the SAFE for Kids Act currently being considered by the New York State Legislature (and heavily pushed by the Governor). As you know, I am the father of four children, ages six through sixteen, and the increasing difficulty of protecting them as they grow and learn to use the internet has occupied a fair amount of my time and attention over the last decade and a half.

I have been employed full time as a software engineer for sixteen years, the last ten of which have been positions in the “infosec” (Information Security) field, including positions with national security clearance and at international companies like GE. I hold two degrees from RPI, one in Computer Science, the other in Information Technology, and I was on faculty there as an adjunct professor for eight semesters, bringing my professional experience into the classroom to help educate the next generation of engineers and scientists.

While I support the general intent, the SAFE for Kids Act as it is currently drafted will do little to protect children from being exploited by Big Tech. Often, when faced with taking action on a complex issue, people will seize on a partial, or even counterproductive idea, and say “we must do something…this is something, therefore we must do this.” Unfortunately, in this case, this will do little to address the underlying problem, and it will simultaneously expose already marginalized kids (and adults) to greater danger online, and make it more difficult for smaller, independent organizations and businesses to develop alternative, ethical websites and apps.

To understand the problems with the bill we need to consider:

  • The value of the internet as a means of distributing high-quality content and information, and building community
  • The need to support the ability of people to have autonomy over their online identity and experience
  • The need to allow pseudonymous and anonymous use of the internet by both minors and adults
  • The business model of “Big Tech” platforms like Facebook/Instagram and TikTok
  • The distinction between “recommended content” and addictive features

Once we evaluate the bill’s impact with these things in mind, we begin to understand why the SAFE for Kids Act will make the internet less usable for New Yorkers, no safer for our children and teens, and more dangerous for members of marginalized communities, while further entrenching the dominance of existing Big Tech and Big Advertising companies.

The Value of the Internet

It would be hard to overstate the value of the internet with respect to its potential for bringing people together and making the world’s knowledge available at low cost to all, at the touch of a button. While misinformation (and even disinformation) has become more widespread in recent years, it’s undeniable that much of the information that used to be available only in print (or on radio or TV) is now online. From news, schedules, and weather, to reference material, catalogs of products, and published research, the internet is our go-to medium for finding information.

Most of us have colleagues and friends we’ve met online. We follow the ongoing work of journalists and writers. We enjoy the community created by sharing clips of our favorite sports teams and athletes. We love sharing our hobbies and seeing the work others have created. The internet is especially enjoyable in this respect for people with more niche interests. Someone with a one-in-a-million hobby may only have a handful of similar people in their city, but they can connect online with hundreds or thousands of people with similar passions.

Even more crucially, the ability to organize on the internet allows members of traditionally marginalized populations—from racial minorities to LGBTQ people to those with disabilities or long-term illnesses—to function in a world that all too often wants to shut them out (or worse). The internet allows people to make connections with others who understand their struggles and, possibly more importantly, allows them to find life-saving resources. That’s especially true for teenagers in those groups.

Autonomy over Identity and Experience

Twenty years ago, if you wanted to switch to a new cellphone carrier, you had to surrender your number. We recognized that this was not in the interests of anyone but the cellphone companies and created laws guaranteeing our rights to our phone number, regardless of whom we chose as a carrier. We should expect the same kind of autonomy over the identities we assume online, and for the same reason. Our phone numbers form part of our public identity. It’s easy to see why we don’t want a private corporation to exercise veto power over our ability to choose a different carrier (or phone).

We experienced similar frustrations when carriers demanded to decide what brands of cellphones we should be allowed to have. Some people prefer iPhones; others Android (and others neither). A variety of devices are now available across carriers. Typically, if you switch carriers, you can take your phone with you as well as your number. We don’t allow the carrier to dictate the whole phone experience.

We would rightfully object if corporate America attempted to dictate that we could only watch Disney content (including ESPN) on Disney TVs and Netflix content only on a TV sold by Netflix. Why, then, do we accept that we can view the content created by our favorite sports teams, celebrities, authors, musicians, and artists only in the apps dictated by Big Tech?

It’s important that we as individuals be able to maintain autonomy over our online identities, making sure that people can follow us to other social media platforms when we leave, just as easily as our friends could reach us on our existing cell phone number when we left AT&T for Verizon. Similarly, it’s important that we be able to control our experience interacting with online content. Why should we accept having to use Meta’s app to view content created and published on Instagram, any more than we would accept having to use Paramount’s TV to watch Syracuse play basketball in March (Paramount owns CBS)?

Anonymous and Pseudonymous Access

We all assume different identities in ordinary life. We often dress differently for work or church or school functions than we do for the gym or a weekend BBQ at the lake. We commonly use titles and last names in formal public settings, while first names are more common among friends and colleagues. We may even be stuck with childhood nicknames with our parents or old friends. Online life shouldn’t have to be different. Potential employers don’t need to see the goofy pictures of my cats that I share with my siblings; my mother shouldn’t expect to scroll through the highly technical work I share with colleagues. Additionally, the ability to exist online under a pseudonym allows members of marginalized populations to ask the more difficult or fraught questions that are nonetheless important, without worrying about repercussions from employers or family.

Unfortunately, there are also many cases of ordinary, or even ethical, behavior being punished by families, communities, employers, and the government. Whistle-blowers are often prosecuted (or worse). Union organizers and community organizers are frequent targets of retributive actions by the powerful. Women are threatened and even jailed for seeking basic reproductive healthcare. Victims of domestic violence—both adults and children—are systematically isolated by abusive partners or parents. And teenagers struggling with their sense of identity or sexuality all too often find themselves ostracized, or even cast out, by families whose beliefs don’t include compassion for those unlike themselves.

Every person described in the preceding paragraph has a compelling need to be able to reach out online, whether just for information, or to make contact with organizations in a position to help. But they can’t do that if making the request, or even just running the search, requires use of their legal identity. Anonymous and pseudonymous access to the internet can be (and certainly is) abused. But it’s a literal lifeline for many who are otherwise very alone in a hostile world.

“Big Tech” Business Model

“Senator, we sell ads.” Meta founder and CEO Mark Zuckerberg delivered that line in his testimony to the U.S. Senate in April of 2018. It certainly remains true today. Both Meta (Facebook/Instagram/Whatsapp) and Alphabet (Google) derive a huge portion of their revenue by selling ads. They can make billions of dollars selling ads because billions of us spend hours every day using their apps and websites. The more time we spend on their apps and websites, the more ads they can sell.

The fact that we might want to spend our time online on other apps or websites, or that we might want to spend our time not looking at a screen at all, is a threat to their ability to sell ads. Their apps are very carefully and deliberately engineered to maximize our time using them, whether it’s constantly checking for new “likes,” endlessly scrolling for new content, or angrily commenting on someone else’s hateful post. Big Tech and Big Ad don’t care that it might be bad for us: they want our engagement and attention.

An app that might allow us to view content shared by our favorite celebrity, sports team, musical group, artist, author, or even just a local business encouraging us to try their new taco special this Tuesday, without us seeing an advertisement, is a threat to their business.

Recommended Content vs. Addictive Features

One of the great promises of computers is that they would relieve us of some monotonous tasks and drudgery. To that end, websites and apps that offload some of the more mundane tasks to an algorithm can be extremely helpful. That can include basics like spam filtering or sorting emails into folders based on sender and subject line. It can include using geographical context: when you search for Paesan’s Pizza, you want the one in Colonie or Latham, not the similarly named businesses in Pennsylvania or Indiana. It can include things like recommending the next book in a series when you’ve just checked out the previous book, or finding reviews for products you’ve looked at, or even similar alternative products others have purchased.

As with so many tools, though, websites and apps can also deploy such algorithms to exploit their users. Some websites or apps are explicitly constructed to trigger the same cognitive impulses as a casino slot machine or carnival barker. But would we honestly want to use an app or website that is prohibited from serving us helpful content? There is a distinct difference between a recommendation and a sales pitch.
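
To make that distinction concrete, here is a deliberately simplified sketch in Python. The posts, field names, and scoring are invented for illustration and are not taken from any real platform; the point is only that the same feed-building machinery can either show people what they asked for, or rank whatever keeps them scrolling.

    # A simplified illustration: the same feed-building machinery can serve the
    # user, or serve engagement metrics. All data and fields here are invented.
    posts = [
        {"title": "Library: the next book in your series is in", "followed": True,  "outrage": 0.0},
        {"title": "Local taqueria: Tuesday taco special",        "followed": True,  "outrage": 0.1},
        {"title": "Rage-bait thread from a stranger",            "followed": False, "outrage": 0.9},
    ]

    def recommendation_feed(posts):
        """A recommendation: show only the accounts the user chose to follow, in order."""
        return [p for p in posts if p["followed"]]

    def engagement_feed(posts):
        """A sales pitch: rank whatever is most likely to keep the user scrolling."""
        return sorted(posts, key=lambda p: p["outrage"], reverse=True)

    print([p["title"] for p in recommendation_feed(posts)])
    print([p["title"] for p in engagement_feed(posts)])

The first function is the kind of helpful, predictable behavior described above; the second is the kind of attention-harvesting the bill is nominally aimed at. The trouble, as discussed below, is that the bill’s definitions do not distinguish between the two.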

SAFE for Kids Act

With all that in mind, we can begin to consider the SAFE for Kids Act. At its core, it purports to address that last issue: addictive features. Unfortunately, its fundamental definition begs the question. The act defines “addictive feed” in such a way that it captures much of what we expect a modern website or app to do. Just about every useful thing described in the previous section qualifies as an “addictive feed” under the bill’s definition. After the definition’s first word, the bill makes no mention of any addictive feature or property.

Even if the definition were productive, the bill doesn’t actually require platforms to provide an environment free of addictive features (or even access to the content by other apps that might not have the same addictive features). It simply allows access with parental permission. We’ve all clicked “agree” countless times for countless apps and websites. Why would parents behave any differently here? The bill does require that apps provide parents with the means to restrict an app’s ability to send notifications in the middle of the night. Unfortunately, it doesn’t require that option to be available for everyone, only “covered minors.” Parents who might be in need of sleep are left out. If we want to maintain some semblance of control over our own online experience, this bill will not help us.
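To illustrate how narrow that notification provision is, here is a hypothetical sketch of a platform complying with it minimally. The field names, the data structure, and the exact overnight window are my own placeholders, not text from the bill:

    from datetime import datetime, time

    # Illustrative overnight window; the exact hours here are my own placeholder.
    QUIET_START, QUIET_END = time(0, 0), time(6, 0)

    def may_send_notification(user: dict, now: datetime) -> bool:
        """Minimal-compliance sketch: the overnight restriction only has to
        exist for covered minors whose parents have turned it on."""
        in_quiet_hours = QUIET_START <= now.time() < QUIET_END
        if user.get("is_covered_minor") and user.get("parent_opted_out_overnight"):
            return not in_quiet_hours
        # Everyone else, including sleep-deprived parents, can still be pinged at 3 a.m.
        return True

Nothing in the bill obliges a platform to offer the rest of us that same switch.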

The bill requires that anyone providing an “addictive feed” (which, remember, includes just about any modern website or app) must use “commercially reasonable methods to determine” if a user is a minor. Unfortunately, there is no technically feasible way to apply that test only to minors: every New Yorker will be required to verify their age.

Age verification requires identity verification. While the bill requires that “information collected for the purpose of determining a covered user’s age…shall not be used for any purpose other than age determination,” there is no existing technology that can enforce that requirement. Any age verification service will need to exchange information with the website or app requesting the check, which means the verification service ends up with a record of which websites and apps a person uses. And the age verification service itself might not even be covered by New York law. Indeed, one of the leading “commercially reasonable methods” for verifying a user’s age is provided by MindGeek, the Canadian company best known as the owner and operator of PornHub. I don’t particularly want to give them my information, let alone that of my children, in order to sign up for services and apps online.
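
To see why that privacy requirement cannot simply be legislated into existence, consider what a third-party age check necessarily looks like from a developer’s perspective. The sketch below is hypothetical (the provider, endpoint, and field names are invented), but any service of this kind has to receive both something identifying the user and something identifying the site that asked:

    import requests

    def verify_age(id_document: bytes, requesting_site: str) -> bool:
        """Ask a hypothetical third-party verifier whether a user is old enough.
        Note what the verifier necessarily learns from this one request: who the
        user is (the document or selfie), and which website or app they are
        trying to sign up for. Nothing in the protocol itself prevents the
        verifier from logging both."""
        response = requests.post(
            "https://age-verifier.example.com/v1/check",   # hypothetical service
            data={"site": requesting_site},                # reveals the site to the verifier
            files={"id_document": id_document},            # reveals the user's identity
            timeout=10,
        )
        response.raise_for_status()
        return response.json().get("is_adult", False)

Whether that verifier honors New York’s data-use restriction is a matter of trust and jurisdiction, not of technology.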

The bill effectively removes the ability of New Yorkers of any age to sign up for websites and apps anonymously or pseudonymously. As we noted earlier, it’s vital—in some cases a matter of life and death—that this ability be preserved to protect already marginalized people.

Verifying a user’s age, providing the opt-out functionality, and giving parents the ability to change notification settings based on a user’s age and the time of day will all be relatively trivial requirements for multi-billion-dollar companies like Meta and ByteDance (the owner of TikTok). But in order for the internet to continue to be a source of high-quality information, and a tool to build communities among real people, we need smaller entities to build and operate websites and apps. These requirements will likely be prohibitively expensive for any number of community groups, non-profits, political campaigns, local churches, and small businesses that want to provide an alternative to the ad-driven, attention-seeking commercial products provided by Big Tech. Rather than protecting our kids (and all New Yorkers), the SAFE for Kids Act will only serve to further entrench the very companies we need to keep in check.

While the goal of liberating our children and teenagers from websites and apps that have been meticulously engineered to capture their attention is a noble one, this bill falls short. It attempts to solve a very real problem. Clearly, we must do something. This is something. But very clearly, we must not do this. Please oppose the SAFE for Kids Act in its current form.

Sincerely,

Samuel B. Johnson