Elon Musk makes $43 Billion offer for private buyout of Twitter

Any sort of content moderation is going to have to come from the platforms themselves because the First Amendment just makes it hard for the government to be the arbiter of content. Then, of course, when they do it, it opens those platforms up to criticism from those who don't like being moderated.

It's this weird combination where our First Amendment allows us to say these things and is, in part, what keeps us from seeing some of the crack-downs that other countries have done on misinformation online . . . and that has allowed for the amplification of some pretty nasty and untrue ideas that can actually harm our communities. But that same wide scope of protection keeps us free to be critical of government, which is a crucial feature distinguishing our society from more controlled, even autocratic nations.

It's a really challenging issue, and deep-fake AI-assisted content is only going to make it harder. Either the Supreme Court evolves our First Amendment standard to include some kind of exception for deliberately false, harmful information (though, again, that's far easier said than faithfully done), or we continue to have to live with this tradeoff.
It used to be that free speech ended right around the analogy of walking into a crowded theater and shouting "fire" when there was no fire. Now I'm not sure even that low bar still holds. Part of the problem is that our population has no sense of context. Many take at face value some idiot's post on FB about how great it was 4 years ago when gas was $1.99, but lose the context of the pandemic lockdown in place at the time. Said idiot garners hundreds or thousands of "likes" from other idiots.
 
Entire world’s knowledge at our fingertips and we’ve lost our curiosity. When I hear some outrageous crap like “they’re eating pets”, my first reaction isn’t to go spread that with my neighbors. I realize it doesn’t sound right so I take the 15 seconds to go look it up. Sometimes the wild thing is true and I learn something, sometimes there’s just missing context, and most of the time it’s just silly.
 
But you and Terps are assuming that understanding is the goal
But it really doesn’t seem to be
A goodly chunk of people are mad, unfulfilled, confused and they crave a target for that - some scapegoat to attach those feelings to
The ridiculousness seems to be a feature not a bug
 

The thing is, 'you can't shout fire in a crowded theater' never really was the law - that was dicta from a 1919 Supreme Court case about the Espionage Act, and Justice Holmes was musing generally about the limits of free speech in a case where the Court upheld a conviction under the act for conduct (demonstrating against the draft) that would easily be considered free political speech by modern standards. I think it has stuck in the minds of people for generations because it's easy to understand and it's sensible . . . but it isn't really accurate as to the First Amendment when it comes to problematic misinformation. It's highly contextual.

The actual legal standard comes from a 1969 case that involved the arrest of KKK members in Ohio. In Brandenburg, the Court ruled that the standard for when violent speech is no longer protected by the First Amendment is when it is “directed to inciting or producing imminent lawless action and is likely to incite or produce such action”. Imminent is the key term - it can't be some generalized notion of violence.
 

I think we could argue that the above is the FOX network's mission statement....

Well said....
 
In your opinion, could the same reasoning you displayed in the last paragraph be used for online lies? (I hate the terms misinformation or disinformation - they're lies)

Online lies have incited an insurrection and caused a pizza place in Washington, D.C. to have armed men show up to stop a nonexistent human trafficking ring supposedly run out of a basement the building didn't even have. And most recently, online lies were amplified so much that LEGAL Haitian immigrants are living in fear and public schools are having to close - public education being a constitutional right (under the 14th Amendment, though not explicit), mind you.

At what point can the case be made that online lies do reach that level? How many people need to be radicalized online and act violently before online lies are considered "directed to inciting or producing imminent lawless action and is likely to incite or produce such action"?

I say now is the time to hold people responsible. Otherwise, we are going to be waiting for a tragedy so profound it will force the change. I'd rather not have a bunch of people die for us to realize you should have at least as much responsibility for the words that come off your keyboard as the ones that come out of your mouth.
 
I’m with you that there should be accountability- but it’s probably the platform as a business entity that should be held responsible (ie getting an actual FCC with teeth again)
Individually ‘commentators’ can claim ‘entertainment’
And I’m not as strident about punishing ‘entertainment’ or ‘satire’
(Should the original War of the Worlds be held liable?)
But they should come with giant warning labels like cigarettes have
 
In your analogy, listeners were warned/informed twice during the War of the Worlds broadcast that it was fictional. Just like most of us ignore the warnings on a pack of cigarettes. FWIW, I don't like that someone can hide behind those warnings: "This is a joke, people, but there is a bomb in this airport."

At some point we have to draw a line. We all know where the line is but how do you draw it without destroying our freedoms? You can’t censor the late night talk shows from making witty observations about this or that. You can’t censor an opinion column in the press.

A true democracy demands that you use common sense and decency in your daily interactions. We are trying but we aren’t there yet.
 

The bar to clear is really high: the incitement of violence has to be aimed at something imminent and specific. Lies about Haitian immigrants almost certainly don't meet it because there is no specific call to action and none of it is a call for imminent violence. Maybe some of it would qualify, but that's going to be an examination on a case-by-case basis. SCOTUS generally is not friendly to prior restraint on speech, particularly political speech, which that stuff probably fits into. Traditionally, SCOTUS allows restrictions on the time, place, and manner of speech, but not on the content of speech, no matter how vile it might be.

But this is even more complicated when you talk about online speech, because it's done on private platforms where the 1st Amendment doesn't apply. For any Constitutional right to apply there must be state action. That concept was stretched quite a bit in the past when the Court wanted to extend protection (the civil rights cases in the 60s and 70s, for instance), but I tend to doubt it would be found to apply to a private platform restricting speech. They do it all the time and have never, to my knowledge, been stopped on Constitutional grounds. So the platforms could take it down or ban those users, but several of them lack the desire to do so for various reasons. At the same time, if the government attempted to tell those platforms what speech should or should not be allowed, the Constitution would almost certainly prevent it from doing so unless the speech is truly incitement of imminent violence. I.e., "There is a Haitian wearing jeans and a Sponge Bob T-shirt on the corner of 4th and Red Street in Brandenburg, Ohio; I am calling on anyone in the area to kill him before he eats another pet" would likely not get Constitutional protection.
 

The term Gaslit Nation comes to mind.
 

The problem with getting the FCC involved is that then there is state action and the Constitution applies. Which means that most speech outside of very specific calls for imminent violence would be protected.
 
I guess to me there is a stark difference between those online who espouse awful opinions - even dangerous ones - and the specific targeting of specific people.

Example-
Liberals are taking over and are indoctrinating our kids!!!

This is nonspecific; it lumps the whole group into one, and without specific directions it doesn't incite anyone to act. It might be heinous and stupid, but it isn't a call to arms.

There are Haitian immigrants in Springfield, OH eating cats and dogs. They have destroyed that town and they need to be stopped.

To me this is where the problems are. The lie gives a specific marginalized group, in a specific location, doing a specific thing that must be stopped by true Americans.

This is inciting violence. Full stop, and the person who started the lie should be held liable for the civil damages of their actions. I am not arguing for jail time or criminality, mind you.

Costs incurred by the community - in this case: Added police presence at schools, expended community resources etc. should be allowed to be clawed back by the community.

I realize it is an incredibly high bar to clear. As it should be.

Sandy Hook parents were able, after years and millions spent, to get a judgment against Alex Jones. All I am saying is Newtown, CT should have been able to sue him as well.
 

The answer to "at what point" is when it becomes inciting or likely to incite "imminent" lawless action. Imminent means that the very nature of the message prompts someone to do something right then - it is that compelling in its message. Somebody saying "immigrants eat cats" may at some point cause violence, but it certainly isn't tailored in its content to cause, or intended to cause, someone who reads the message to feel compelled to violence right then - even though it may be part of a mindset that eventually turns violent.

The problem is that it's easy to say a new standard is needed, but the devil is in the details. Who decides what is false and intended to cause violence, and in a way that will not be commandeered by authoritarian forces? I think that's the biggest risk: an obvious unintended consequence is that the more speech is criminalized, the more easily the margins can be manipulated.

For example, if a social media message saying we should go to Washington to protest a president’s immigration policy goes viral and creates a real protest that becomes violent, could the government then decide that future such messages are inciting violence and prosecute the next people who post similar messages?

The Bill of Rights is grounded in protecting individuals from government abuse - so that always has to be the starting point. Would changing the standard make it easier for the government to criminalize speech it doesn’t like? And we have seen that we cannot presume the government will act in a just way.
 
I actually feel like we have a good balance in place. The part people keep forgetting is private business isn't the government. Any of us can go outside of a government building and start screaming pretty much whatever we like at the government. When you step foot on private property, whether that property be online or physical, those same rules don't apply. The safeguard for this has always been money. If a business wants to play stupid games then they win stupid prizes.

As for Elon, Trump and Twitter allowing foreign governments or even being paid by foreign governments to push their agenda is where I have a problem. I think we are seeing collusion between the Kremlin and the far right. To me that is not a matter of free speech when foreign influence and financial transactions start taking place. So far the links are akin to a ring of fire around the main players with a ton of smoke but I think most can see exactly what is happening. It can't be by chance the Kremlin and these groups keep showing up in the same room and dancing.

Twitter has been different from things in the past. It appears Elon is willing to tank the value of Twitter, chase off all the advertisers, and crater Tesla in exchange for dictating policy and politics. When the funding for that is being linked back to enemies of the state, I have a problem with it and question where the Justice Department's line in the sand is.

To further complicate things, Elon has made himself someone the government needs through SpaceX. I have no doubt that the US Government and military view SpaceX as a huge opportunity, which is likely giving him a lot more rope. It's really easy to see Elon as the egotistical maniac he is; he is not attempting to hide it. This is all going to blow up at some point. Elon can't help himself, and if competition comes along to compete with SpaceX it will likely end in spectacular fashion. Until then, he's going to eventually have to answer to investors in Twitter and Tesla while the government tries to keep him contained. Both companies are in prime position for a full-on crash.
 
Agreed and I think the DOJ has been remiss in pursuing this aggressively (and in recent history, more than remiss).
 
