Welcome to Mark Zuckerberg’s New Dystopia: the Metaverse

Americans are familiar with the idea of a dystopian future, dominated by an inescapable metaverse and ruled over by Big Tech overlords. Films such as “Blade Runner” and “Minority Report” depict a world conquered by technology and the terrifying consequences.

Facebook founder and CEO Mark Zuckerberg recently announced that he wants to take that world from the screen and make it our own.

“Facebook is making itself bigger. It’s basically making an incursion into your everyday life with this metaverse conception,” explains Kara Frederick, research fellow in technology policy at The Heritage Foundation, parent organization of The Daily Signal. “This is going to be, in my mind, a totalization of control of your life in the future.”

Frederick joins “The Daily Signal Podcast” to discuss Zuckerberg, the metaverse, and how dire the consequences might be if Big Tech is allowed to take over our lives.

We also cover these stories:

  • The Supreme Court prepares to hear oral arguments in Dobbs v. Jackson Women’s Health Organization, the abortion case that could overturn Roe v. Wade.
  • Former White House chief of staff Mark Meadows reaches an initial agreement to cooperate with a House select committee investigating the Jan. 6 riot at the Capitol.
  • New York City becomes the first American city to open government-sanctioned drug-injection sites for addicts.

Listen to the podcast below or read the lightly edited transcript.

Doug Blair: Our guest today is Kara Frederick, a Heritage Foundation research fellow in technology policy, focusing on Big Tech. She was also a member of Facebook’s Global Security Counterterrorism Analysis Program and a team lead for Facebook Headquarters’ Regional Intelligence Team in Menlo Park, California. Kara, welcome to the show.

Kara Frederick: Thanks for having me.

Blair: Awesome. Facebook founder Mark Zuckerberg recently announced the creation of Meta, which is a new company that will serve as the parent company for Facebook, Instagram, WhatsApp, and a variety of other applications underneath that sort of brand. Practically, what does Meta mean for these platforms? What has changed now?

Frederick: Yeah, I don’t know if “practically” is the right word for it, but the idea of the metaverse originally came from a dystopian sci-fi novel in the ’90s, “Snow Crash,” and this was supposed to be a virtual world that should terrify the reader. It probably did terrify a lot of the readers in those days, and now it’s terrifying us in real life.

So what it actually means for all of the applications and companies that Facebook has gobbled up over the years, like WhatsApp and Instagram, all of these companies that they own and that are now subsumed under Meta, is that Facebook is only going to grow. Not the regular Facebook platform, which is going to retain its name, but the larger company is expanding.

Zuckerberg’s been really, really good at diversifying. So he saw something great in Kevin Systrom with Instagram, he saw something great when it came to WhatsApp and Brian Acton and Jan Koum and what they had done with their encrypted messaging platform. And he recognized that this is what Facebook needed to become, something akin to Google’s Alphabet.

If you remember, when Google rebranded itself, you had Google Search and then you had YouTube, the video platform, and the short-video format, remember, sort of grew out of Vine, [with] those short videos. Now Facebook is copying that with Reels.

So Facebook is making itself bigger. It’s basically making an incursion into your everyday life with this metaverse conception. And I think it bears sort of defining what this concept of a metaverse is.

And I’ve heard it talked about as a virtual world, but a way to think about it is as a mix of augmented reality, virtual reality, and gaming. They’re also trying to institute NFTs, these nonfungible tokens, which are sort of like digital deeds to digital assets. So it’s not just completely immersive; it’s a digitization of everything.

The Wall Street Journal, I think, put it in plain terms when they said in the metaverse you’re going to be able to walk on the moon in your pajamas. So think about that sort of virtual world where you can—Erika Nardini has talked about this. You can try on clothes, as some younger kids do on specific platforms that are already engaging in this metaverse concept.

You can own your own virtual house. You can, and Zuckerberg has actually laid out a vision for this, worship in the metaverse. So instead of going to church … physically and engaging in that body-soul composite that, at least in a Judeo-Christian understanding, is what worship consists of, you will only do it virtually. And in some instances, you’ll do it without legs, with the half-digital avatars that Zuckerberg himself displayed when he was rolling out this concept.

So it’s that all-encompassing virtual world. Yeah, there’s going to be a mix of what is meatspace, the physical world, and the virtual world. That’s the augmented reality concept. But pretty much this is going to be, in my mind, a totalization of control of your life in the future. So it basically, in a nutshell, means a lot for Facebook.

Blair: Yeah. I mean, that is a lot to unpack here. I mean, it sounds like this is a comprehensive reform of just how people would live their everyday lives. You mentioned virtual reality, augmented reality. That’s not something that you stop at the home. That’s something that extends out into the real world itself. Given that there is so much going on here, what is Mark Zuckerberg’s rationale? What benefit does this explicitly give Zuckerberg?

Frederick: I think you might have seen some reports, especially … The Wall Street Journal exposé, which, remember, drew on the whistleblower’s documents. She leaked a trove of documents, a lot of internal research that Facebook took it upon itself to conduct, and the results therein.

So out of this Wall Street Journal exposé, we found that, yes, these platforms are toxic to users, female users and teens in particular; you saw all the stats on that. Instagram made body image worse for one in three American teenage girls, that kind of thing.

And you can go and look back at The Wall Street Journal. They have all of those details listed and encompassed in their Facebook files documents. So I think that is worthwhile.

So besides that, what came out of that trove of documents was the fact that Facebook is hemorrhaging their teen users. And they’re also targeting preteens specifically because they know they’re losing their purchase on this demographic. They said tweens were sort of ripe as a market demographic for them, as a target.

Now Zuckerberg has come out and said, “We think young adults are our North Star.” So this is 18- to 29-year-olds he’s talking about in this instance. Think of people who look at Barstool Sports. I mentioned the CEO, Erika Nardini, those people that they’re getting on TikTok, and that are sort of surging ahead when it comes to capturing that base.

But about that hemorrhaging I talked about: Facebook’s internal research from those Wall Street Journal whistleblower documents basically said they expected to lose an additional 45% of their teen usership by 2023.

So they know that they’re losing those users to Snapchat, to TikTok, to other platforms that are more popular with the young kids these days. And they know that they need to appeal to them to a different degree. If that means gaming, the things younger people are more interested in, then they’re leaning into that. The whole NFT concept that people are so excited about for now, they’re leaning into that as well.

So the metaverse, in Zuckerberg’s mind, I believe, is hopefully going to scoop up the people that they’re losing on their regular Facebook-branded social media platform.

Blair: Interesting. I’m curious, actually, I’m really glad that you mentioned that report that came out about Instagram specifically, that it was reported that these platforms are toxic for mental health, specifically for younger girls. Do you think that maybe this metaverse will also have that impact on people? We’ve seen with the current tech platforms that it has this impact. How does that affect people’s mental health now that it’s just everywhere?

Frederick: Oh, yeah. I don’t see that it can’t. In fact, I think this is the next step in the dehumanization process when it comes to, like I said before, the totalization of control.

You keep your entire life in the digital world. What does that mean? It means it can more easily be controlled, especially by companies that have these great concentrations of power and have the proven capacity and willingness to abuse that power in ways that do not necessarily redound to the benefit of human flourishing.

And yeah, that’s laid bare by this trove of documents, where you see the impact it’s having on children. Facebook itself knows, from 2019 to 2022, the toxic impact that it’s having on teen girls; they’re very aware of this.

So I think this is just another supercharged step in that process when you’re talking about, instead of people going to church and building community, you have that ersatz conception of what a community is, if you worship digitally online through your digital avatar.

And I would like to come back to this concept that I think conservatives should be very interested in, that body-soul composite concept that is deeply embedded in Judeo-Christian philosophy and ideology.

We’re not just our digital avatars, we’re not just our bodies, we’re not just our souls, we’re both. And when you take the soul out of it, when you take the body out of it physically too, then what are we but these just digital floating creatures with no real links to each other?

And my mom used to tell me, “Life is all about the people,” right? And when you’re in the metaverse, it’s not about the people at all. It’s about—I don’t even know what it’s about, but it’s not real.

So I think, yeah, there’s toxicity in terms of hard data and whatnot. We’d absolutely see the effects of what worshiping online looks like. Look at the lockdowns. A lot of people have already fallen away from church, and when they just worship on a screen, we all know that that is not fulfilling. That is not what we were made for. It is just a replacement, a poor version of what I think Zuckerberg and all of these technologists espousing techno-solutionism to all of our problems are promising.

Blair: And that is a fascinating point. And I hadn’t even thought of this concept. We were talking about this a little bit earlier before we started recording, but the idea of a kind of cyberpunk-esque, like, you are no longer a person, you are part of the machine, you are part of this big technological space now. And I just think that is horribly scary.

Frederick: Exactly. And I am not somebody who thinks we should return to scratching a living out of caves and necessarily [get rid of] all technology whatsoever. But I do think the return component is important, and we should return to what, especially as conservatives, made us successful in this country.

Family, faith, all of those foundations are extremely important; I’m not decrying that whatsoever. But I do think that there is a way to dig out of this through technology, and that’s to make sure that technologies are decentralized, to make sure that the user has more control rather than a top-down concentration of power imposing whatever ad hoc, vague rules it wants on the users of a platform. And, we talked about practicality in the beginning, that has a very practical application when it comes to policy now.

These companies should not be aggrandizing everything to themselves and then abusing that concentration of power, especially when the individual, the little guy, wants to be able to speak freely and voice their political viewpoints.

We’ve seen, especially since the election cycle of 2020, how that can be used to put a thumb on the scale of our electoral process. I’ll go into some of the data that the Media Research Center uncovered, and this was as early as November 2020, after the Hunter Biden New York Post laptop story was actively suppressed on Twitter and Facebook.

Media Research Center found that 1 in 6 Biden voters would’ve changed their vote had they been aware of information that was actively suppressed by these companies.

More recently, McLaughlin & Associates came out with another survey in which 52% of Americans said that social media companies’ censoring of the Hunter Biden laptop story constituted election interference.

So we’re seeing how these Big Tech companies actually abuse their power. And I don’t see any reason for them to stop doing it if conservatives don’t say, “Enough is enough,” especially as all of life starts to bleed into the metaverse, which people are pushing from a business perspective.

There’s a study that I’m probably going to fumble right now, but the bottom line is the metaverse was mentioned seven times in the press last year. And now recently it’s been, I mean, that has increased just, I don’t want to say tenfold because I’m not quite sure the numbers, but everybody’s talking about it to the tune of over a hundred, at least, mentions of the metaverse now.

Everybody wants a piece because they see that this is in fact the future. But I think we have a duty to resist it and move toward the more privacy-preserving technologies that are going to allow people to talk without worries of self-censorship or censorship from the Big Tech companies coming down upon them.

Blair: Privacy was definitely a concern that I thought about when I was researching this topic. I mean … let’s go back to cyberpunk again. One of the movies that it really evoked for me was “Minority Report,” with this idea of an omnipresent surveillance system, funded and powered by tech and augmented reality, that’s just inescapable. I mean, are we at that level? Is that even possible? Or is that the kind of privacy concern we’re talking about?

Frederick: So, we’re not there yet. But when people want to see a demonstration of the bleeding edge of digital surveillance tools turned inward on a population, clearly, take a look at China, take a look at what they’re doing, especially in Xinjiang.

Much has been made of the human rights abuses, the genocide that’s occurring there against the minority Uyghur population, which is very important. But the fact that this genocide is tech-enabled should raise people’s eyebrows and, I think, should get a lot of attention. It does in some circles, I would say in my circles, but not necessarily among the greater American populace.

And one thing that Human Rights Watch actually did: a researcher there helped reverse-engineer what is called the Integrated Joint Operations Platform. What that basically does is give officials, in the palm of their hands, the ability to integrate data like how much gas or electricity is used in a house, even how much toilet paper is used, what doors are used. It all sits there in their palm, and they’re able to make assessments from all of that data.

There are data doors that grab information off SIM cards as you pass through them, all sorts of tech-enabled surveillance that is really a warning sign to freedom-lovers everywhere.

And not just that, but the social credit system. This originally consisted of pilot programs that were more binary. They had blacklists and red lists of people: if someone displeased the government, if they jaywalked in some way, then they would be prevented from having access to certain services. People on a blacklist don’t get government subsidies; they’re not allowed to buy train tickets. All in the digital world.

So there’s still a lot of back and forth among researchers on how heavily something like artificial intelligence factors into these determinations. But we do know that at least the seeds of these pilot programs are there.

And if they’re able to fuse disparate data sets together, integrate that data, parse through it with new technologies like artificial intelligence and machine learning, and develop insights, then that potentially gives them the ability to scrutinize dissenters, to scrutinize what would be enemies of the state.

So if anyone wants to see what a surveillance state that’s terrifying looks like, just look to China.

What troubles me the most, when you ask, “Are we there?”, is that I’m starting to see the surveillance tools turned inward on the American population, or at least the seeds of this blooming now, which is extremely troubling to me.

And I think this is taking place already in the physical world. You look at the National School Boards Association letter and “domestic terrorist,” right? Now tags of terrorism are being used to label parents who don’t buy into the critical race theory teachings in our public schools. To me, when you start to use counterterrorism tools and look inward at a domestic U.S. population, what these tech companies are doing will easily follow.

And I myself, having worked on the counterterrorism analysis team for global security at Facebook, am starting to see that happen in the digital world as well.

If you look at the Global Internet Forum [to Counter Terrorism], GIFCT, they have former Facebook employees there and people who have worked the digital counterterrorism problem for a while. They’re now expanding their database to include far-right, or at least right-leaning, content, what they call extremism.

And GIFCT issued a report, I believe it was last year, saying that it wants to include white supremacist content and whatnot, in part to right the wrongs of bias and discrimination in the counterterrorism profession.

So when you see white supremacy, domestic extremism, right-leaning terrorism sort of thrown about, my mind goes to, “OK, this is what happened when … one part of our counterterrorism apparatus in the justice system was used within five days of that National School Boards Association letter by the Department of Justice to target regular parents, people in the mainstream.”

You wouldn’t even consider some of these people conservatives. A lot of people, especially those who maybe voted for [Glenn] Youngkin based on his anti-critical race theory stance, would consider themselves regular Democrats, maybe even left-of-center originally. But when you start to expand that pool of potential domestic extremists to encompass run-of-the-mill conservative content, and even stuff that’s farther to the left than that, then I think we’re in a troubling space.

I’m starting to see the manifestations of that in the digital world, among tech companies. And that worries me gravely.

Blair: I mean, it sounds horrific. It sounds really, really scary what we’re in for possibly. But we are at, I think, the point where Pandora’s box has been opened. You mentioned at the very beginning, you’re not in favor of stuffing the tech back in the box, it’s out there … we’re using it. You can’t just put it away. Given that there are tech companies that are using this power to surveil American citizens and to censor and there’s governments that you’ve seen in China that are also using this technology to the detriment of their citizens, what is the policy prescription to make sure that doesn’t happen here?

Frederick: Yep. That’s a great question. But first, I want to say something that I forgot to say initially: it’s the symbiosis between the federal government and these Big Tech companies that is also very, very worrying to me. And the actual data point that we have for that is [White House press secretary] Jen Psaki.

When she stood up at the White House podium and said there are 12 people on Facebook and these social media platforms who purvey 65% of the disinformation, I think she was talking about COVID misinformation, which we know is a catch-all at this point for anything that disagrees with the leftist narratives promulgated by Psaki herself and [President] Joe Biden and everyone who basically pulls that line.

So when they say that and within a month, Facebook takes down all of those accounts, there’s a problem there.

So that increasing symbiosis between organs of the state and these private companies, really troubling. And we don’t even have to get into potential end runs around the Fourth Amendment when it comes to outsourcing surveillance to companies like Clearview AI, using their facial recognition software to actually put U.S. citizens in jail for walking through the Capitol on Jan. 6, that kind of thing.

That’s a whole ’nother can of worms, but it is all part and parcel of the fact that these tech companies are being directly responsive to organs of the state. Very troubling.

So in terms of policy prescriptions, I might go out on a limb here, but I don’t think that all of the solutions lie in Washington, D.C. As our new incoming president says, “You have to have a beachhead here in D.C.” And there is a role for Congress and the federal government to play in ameliorating some of the effects that Big Tech has had on conservatives and the American people in particular, but solutions need to be found outside of the District, in my mind. And that starts with a return to federalism.

I think some of the states are putting forward some great legislative proposals. You even had [Gov.] Ron DeSantis in Florida stipulate in his election security bill that no Zuckerbucks are allowed in Florida, so Mark Zuckerberg can’t seed those hundreds of millions of dollars into specific local elections like he did in the 2020 election.

Additionally, you have certain attorneys general, like Ohio’s, suing, saying Facebook committed a breach of contract when it promised one thing but the platform delivered another, namely toxicity to teen users. There’s a common adage in the tech policy community: if the product is free, you are the product. That’s what the AG in Ohio alluded to as well.

So states are taking it upon themselves, Texas is doing this, to rein in the power of Big Tech. That’s one way of looking at the problem.

If you want to get into the policy wonkiness, federal government stuff, we at The Heritage Foundation believe in the reform of Section 230 of the 1996 Communications Decency Act, which basically, in a very broad nutshell, relieves these tech companies of liability for content posted on their platform.

So there’s the publisher-versus-platform debate. Section 230 says that these are platforms and not publishers. We argue that in instances where these tech platforms act as publishers, like Twitter editorializing on its feed, they should effectively be stripped of their immunity from lawsuits.

So there’s been talk about private right of action, letting the people actually sue these tech companies when they are acting as publishers and not platforms, when they’re influencing the information that you see in a way that newsrooms do.

So we’d definitely like to reform Section 230 to align with Congress’ original intent. You know, these were the 26 words that created the internet. Great. We love the flourishing. We love to see a genuine free market. We want these companies to compete and let the best one win. But … [they] have an unfair advantage, and Section 230 is the government intervention. So let’s readdress that.

And then I think civil society is critical here too. Grassroots are extremely important. Again, parents looking at this need to say, “Enough is enough.” Let’s generate the enthusiasm that the anti-critical race theory movement generated among the grassroots. I think when parents lobby their states and their state legislature to rein in these companies, and not just in the red states, but in other states as well, that would be a great thing.

And they need to push these companies to enact more transparency, because one of the biggest problems is the lack of recourse and the lack of transparency from these companies about their behavior and their practices. They’re not going to tell you anything if they don’t have to tell you anything.

There’ve been a couple of instances, and I would actually say that Facebook has been fairly decent about issuing transparency reports, but at the same time, they don’t tell you much. They throw you a bone, they check the block, and you’re able to see what they want you to see.

And then finally, and I would argue most importantly, which hearkens back to what we’ve talked about before, is building anew. And [it’s] going to be very difficult. First-mover advantage, network effects, these are huge when it comes to technology companies, right? There’s a technical component to it as well.

You look at Google: they’ve been able to collect a high volume and variety of data for years and years before most search companies getting started today. They’re able to refine their algorithms and give you a great product because they’ve had the time to do it. So there’s that technical advantage as well.

But I think conservatives have to start somewhere. You look at specific investors, like Peter Thiel and J.D. Vance investing in Rumble, which is a YouTube competitor. Great. That allows conservatives to at least have redundancy. So if they have a video on YouTube that’s going to be pulled down, they can put that video on Rumble as well.

And I’m not stumping for a specific company, but the alternatives that actually contribute to what should be a genuine free market would be useful.

Martin Avila is doing this with Right Forge. They’re creating cloud hosting infrastructure and services so that something like what happened with Amazon Web Services and Parler can’t happen again. Because we know that the real story behind Parler wasn’t Google or Apple kicking them off at the higher level of the stack, at that application layer, the digital platform layer. It was at more of the mid-tier layer of the stack: when Amazon Web Services’ cloud hosting was pulled from Parler, it was lights out for them, and the platform as it was originally conceived ceased to exist.

So that’s huge to get into the middle part of the digital stack and create those services too.

I would argue conservatives have to look more at the foundational layers of the stack as well, like internet service providers. Take the Texas anti-abortion law and look at GoDaddy: GoDaddy refused to host websites that were supporting that Texas anti-abortion law because of ideological perspectives. … I mean, come on, that is exactly why they did it.

So if we don’t have people who hew to the real, true democratic free speech promises of technology and create internet service providers with a real devotion to freedom of expression, we’re going to be lost. The GoDaddy situation is going to happen again. AWS is going to happen again.

We’ve seen this at smaller e-commerce levels as well: Shopify refusing to host any of [Donald] Trump’s merchandise after Jan. 6. You’ve seen it with Kickstarter, an online fundraising platform, refusing to host more conservative-minded films.

You’ve seen this with PayPal: they partnered with the Anti-Defamation League and they are, again, targeting domestic extremism. We know now what that means. It doesn’t necessarily mean the real bad guys; it means conservatives or people who don’t buy into leftist narratives.

So there’s a whole digital world out there that conservatives need to be wary of getting kicked off. Clearly, they’re constricting our digital lives and life has digital characteristics now. … If we have something like the metaverse, think about what that aspect of control can look like.

So conservatives need to build with the full stack in mind and look at the IP level, look at cloud hosting services, look at digital platforms themselves, and just start building anew at the very least for that redundancy capability so you don’t lose everything you have by relying on companies that can pull the rug out from under you in a second because of ideological differences.

Blair: Excellent. Before we wrap up, I wanted to quickly ask, is there a website or a resource that you recommend our listeners check out if they want to learn more about this topic?

Frederick: Yeah, sure. We’ve consolidated most of our work and commentary on Big Tech at heritage.org/bigtech. So take a look at the technology tab on the heritage.org website, and you’ll see some of our writings and most of the commentary that we have. And, to whet your appetite, we’ve got some future publications coming out as well.

And now that we’ve fully built out our Center for Technology Policy, we are firing on all cylinders. So stay tuned. Heritage.org is where you can find a lot of our work, and yeah, come join us. Come help us in this fight, because it’s going to be a long one, far into the future.

Blair: Excellent. Well, thank you so much. That was Kara Frederick, a Heritage Foundation research fellow in technology policy, focusing on Big Tech, as well as a former member of Facebook’s Global Security Counterterrorism Analysis Program and a former team lead for Facebook Headquarters’ Regional Intelligence Team in Menlo Park, California. Kara, it was fantastic having you on the show.

Frederick: Always. Thanks, Doug.

Have an opinion about this article? To sound off, please email letters@DailySignal.com and we’ll consider publishing your edited remarks in our regular “We Hear You” feature. Remember to include the URL or headline of the article plus your name and town and/or state. 
