#alexa
beachscapelife · 3 months
Alexa Pearl
mysharona1987 · 2 years
Well this isn’t at all scary or deeply weird.
alexademieupdates · 17 days
ALEXA DEMIE (new): Alexa Demie in Inglewood, CA. 10/03/24
lisanamjoon · 1 month
SICK ALEXA (2024)
Amazon Alexa is a graduate of the Darth Vader MBA
Next Tuesday (Oct 31) at 10AM Pacific, the Internet Archive is livestreaming my presentation on my recent book, The Internet Con.
If you own an Alexa, you might enjoy its integration with IFTTT, an easy scripting environment that lets you create your own little voice-controlled apps, like "start my Roomba" or "close the garage door." If so, tough shit, Amazon just nuked IFTTT for Alexa:
https://www.theverge.com/2023/10/25/23931463/ifttt-amazon-alexa-applets-ending-support-integration-automation
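The applets Amazon revoked were thin glue code: a voice phrase fires a webhook, and the webhook triggers a device action. A minimal sketch of the shape of that glue, using IFTTT's Maker Webhooks trigger-URL scheme — the event name and key below are hypothetical placeholders, not a real account:

```python
# Sketch of the kind of glue an IFTTT applet provided: a voice phrase
# fires a webhook, which triggers a device action ("start my Roomba").
# The event name and API key are hypothetical placeholders.

def build_trigger_url(event: str, key: str) -> str:
    """Build an IFTTT Maker Webhooks trigger URL for a named event."""
    return f"https://maker.ifttt.com/trigger/{event}/with/key/{key}"

def trigger_payload(*values: str) -> dict:
    """IFTTT webhook triggers accept up to three optional values."""
    return {f"value{i}": v for i, v in enumerate(values[:3], start=1)}

if __name__ == "__main__":
    url = build_trigger_url("start_my_roomba", "MY_SECRET_KEY")
    print(url)
    print(trigger_payload("kitchen"))
```

The point is how little there was to "nuke": the integration was an HTTP POST. What made it fragile wasn't the engineering, it was who held the keys.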
Amazon can do this because the Alexa's operating system sits behind a cryptographic lock, and any tool that bypasses that lock is a felony under Section 1201 of the DMCA, punishable by a 5-year prison sentence and a $500,000 fine. That means that it's literally a crime to provide a rival OS that lets users retain functionality that Amazon no longer supports.
This is the proverbial gun on the mantelpiece, a moral hazard and invitation to mischief that tempts Amazon executives to run a bait-and-switch con where they sell you a gadget with five features and then remotely kill-switch two of them. This is the prime directive of the Darth Vader MBA: "I am altering the deal. Pray I don't alter it any further."
So many companies got their business-plan at the Darth Vader MBA. The ability to revoke features after the fact means that companies can fuck around, but never find out. Apple sold millions of tracks via iTunes with the promise of letting you stream them to any other device you owned. After a couple years of this, the company caught some heat from the record labels, so they just pushed an update that killed the feature:
https://memex.craphound.com/2004/10/30/apple-to-ipod-owners-eat-shit-and-die-updated/
That gun on the mantelpiece went off all the way back in 2004 and it turns out it was a starter-pistol. Pretty soon, everyone was getting in on the act. If you find an alert on your printer screen demanding that you install a "security update" there's a damned good chance that the "update" is designed to block you from using third-party ink cartridges in a printer that you (sorta) own:
https://www.eff.org/deeplinks/2020/11/ink-stained-wretches-battle-soul-digital-freedom-taking-place-inside-your-printer
Selling your Tesla? Have fun being poor. The upgrades you spent thousands of dollars on go up in a puff of smoke the minute you trade the car into the dealer, annihilating the resale value of your car at the speed of light:
https://pluralistic.net/2022/10/23/how-to-fix-cars-by-breaking-felony-contempt-of-business-model/
Tesla has to detect the ownership transfer first. But once a product is sufficiently cloud-based, they can destroy your property from a distance without any warning or intervention on your part. That's what Adobe did last year, when it literally stole the colors from your Photoshop files, in history's SaaSiest heist caper:
https://pluralistic.net/2022/10/28/fade-to-black/#trust-the-process
And yet, when we hear about remote killswitches in the news, it's most often as part of a PR blitz for their virtues. Russia's invasion of Ukraine kicked off a new genre of these PR pieces, celebrating the fact that a John Deere dealership was able to remotely brick looted tractors that had been removed to Chechnya:
https://pluralistic.net/2022/05/08/about-those-kill-switched-ukrainian-tractors/
Today, Deere's PR minions are pitching search-and-replace versions of this story about Israeli tractors that Hamas is said to have looted, which were also remotely bricked.
But the main use of this remote killswitch isn't confounding war-looters: it's preventing farmers from fixing their own tractors without paying rent to John Deere. An even bigger omission from this narrative is the fact that John Deere is objectively Very Bad At Security, which means that the world's fleet of critical agricultural equipment is one breach away from being rendered permanently inert:
https://pluralistic.net/2021/04/23/reputation-laundry/#deere-john
There are plenty of good and honorable people working at big companies, from Adobe to Apple to Deere to Tesla to Amazon. But those people have to convince their colleagues that they should do the right thing. Those debates weigh the expected gains from scammy, immoral behavior against the expected costs.
Without DMCA 1201, Amazon would have to worry that their decision to revoke IFTTT functionality would motivate customers to seek out alternative software for their Alexas. This is a big deal: once a customer learns how to de-Amazon their Alexa, Amazon might never recapture that customer. Such a switch wouldn't have to come from a scrappy startup or a hacker's DIY solution, either. Take away DMCA 1201 and Walmart could step up, offering an alternative Alexa software stack that let you switch your purchases away from Amazon.
Money talks, bullshit walks. In any boardroom argument about whether to shift value away from customers to the company, a credible argument about how the company will suffer a net loss as a result has a better chance of prevailing than an argument that's just about the ethics of such a course of action:
https://pluralistic.net/2023/07/28/microincentives-and-enshittification/
Inevitably, these killswitches are pitched as a paternalistic tool for protecting customers. An HP rep once told me that they push deceptive security updates to brick third-party ink cartridges so that printer owners aren't tricked into printing out cherished family photos with ink that fades over time. Apple insists that its ability to push iOS updates that revoke functionality is about keeping mobile users safe – not monopolizing repair:
https://pluralistic.net/2023/09/22/vin-locking/#thought-differently
John Deere's killswitches protect you from looters. Adobe's killswitches let them add valuable functionality to their products. Tesla? Well, Tesla at least is refreshingly honest: "We have a killswitch because fuck you, that's why."
These excuses ring hollow because they conspicuously omit the possibility that you could have the benefits without the harms. Like, your tractor could come with a killswitch that you could bypass, meaning you could brick it at a distance, and still fix it yourself. Same with your phone. Software updates that take away functionality you want can be mitigated with the ability to roll back those updates – and by giving users the ability to apply part of a patch, but not the whole patch.
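None of this is technically hard. A device that snapshots its state before each update can revert a bad one, and it can let the owner accept part of a patch and decline the rest. A sketch of that logic — the "feature patch" format here is invented for illustration, not any vendor's real update system:

```python
# Sketch of an update mechanism with rollback and selective application.
# A patch is a dict of feature-name -> new version; None means the
# vendor wants to revoke that feature. (Format invented for illustration.)

class UpdateManager:
    def __init__(self, features: dict):
        self.features = dict(features)   # current feature set
        self.history = []                # snapshots, for rollback

    def apply(self, patch: dict, accept=None):
        """Apply a patch. If `accept` is given, only the listed
        features change -- the owner takes part of the patch."""
        self.history.append(dict(self.features))
        for name, version in patch.items():
            if accept is not None and name not in accept:
                continue                       # owner declined this change
            if version is None:
                self.features.pop(name, None)  # feature revoked
            else:
                self.features[name] = version

    def rollback(self):
        """Restore the state from before the last applied patch."""
        if self.history:
            self.features = self.history.pop()

if __name__ == "__main__":
    alexa = UpdateManager({"ifttt": "1.0", "music": "2.3"})
    # A patch that revokes IFTTT but also fixes the music player:
    alexa.apply({"ifttt": None, "music": "2.4"}, accept={"music"})
    print(alexa.features)  # IFTTT survives, music is patched
```

Twenty-odd lines. The reason your gadget doesn't work this way is a business decision backed by DMCA 1201, not an engineering constraint.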
Cloud computing and software as a service are a choice. "Local first" computing is possible, and desirable:
https://pluralistic.net/2023/08/03/there-is-no-cloud/#only-other-peoples-computers
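"Local first" means the device's state lives on the device, with the cloud as an optional, best-effort mirror rather than a required dependency. A minimal sketch of that write path — all names here are hypothetical, chosen to illustrate the pattern:

```python
# Sketch of a local-first write path: every write lands in local
# storage immediately; cloud sync is queued and purely best-effort.
# If the vendor's servers vanish, reads and writes keep working.

class LocalFirstStore:
    def __init__(self):
        self.local = {}     # authoritative local state
        self.pending = []   # queued, optional cloud-sync operations

    def set(self, key, value):
        self.local[key] = value            # local write always succeeds
        self.pending.append((key, value))  # sync later, if ever

    def get(self, key):
        return self.local.get(key)         # reads never touch the network

    def sync(self, upload) -> bool:
        """Try to flush pending ops via `upload`; failure is harmless."""
        try:
            for op in self.pending:
                upload(op)
            self.pending.clear()
            return True
        except OSError:
            return False                   # device still fully functional

if __name__ == "__main__":
    store = LocalFirstStore()
    store.set("garage_door", "closed")

    def dead_cloud(op):                    # vendor shut the servers down
        raise OSError("service retired")
    store.sync(dead_cloud)                 # fails quietly
    print(store.get("garage_door"))        # local state is untouched
```

The design choice is the order of operations: local storage is the source of truth and the network is a cache, instead of the other way around.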
The cheapest rhetorical trick of the tech sector is the "indivisibility gambit" – the idea that these prix-fixe menus could never be served a la carte. Wanna talk to your friends online? Sorry, there's just no way to help you do that without spying on you:
https://pluralistic.net/2022/11/08/divisibility/#technognosticism
One important argument over smart-speakers was poisoned by this false dichotomy: the debate about accessibility and IoT gadgets. Every IoT privacy or revocation scandal would provoke blanket statements from technically savvy people like, "No one should ever use one of these." The replies would then swiftly follow: "That's an ableist statement: I rely on my automation because I have a disability and I would otherwise be reliant on a caregiver or have to go without."
But the excluded middle here is: "No one should use one of these because they are killswitched. This is especially bad when a smart speaker is an assistive technology, because those applications are too important to leave up to the whims of giant companies that might brick them or revoke their features due to their own commercial imperatives, callousness, or financial straits."
Like the problem with the "bionic eyes" that Second Sight bricked wasn't that they helped visually impaired people see – it was that they couldn't be operated without the company's ongoing support and consent:
https://spectrum.ieee.org/bionic-eye-obsolete
It's perfectly possible to imagine a bionic eye whose software can be maintained by third parties, whose parts and schematics are widely available. The challenge of making this assistive technology fail gracefully isn't technical – it's commercial.
We're meant to believe that no bionic eye company could survive unless they devise their assistive technology such that it fails catastrophically if the business goes under. But it turns out that a bionic eye company can't survive even if they are allowed to do this.
Even if you believe Milton Friedman's Big Lie that a company is legally obligated to "maximize shareholder value," not even Friedman says that you are legally obligated to maximize companies' shareholder value. The fact that a company can make more money by defrauding you by revoking or bricking the things you buy from them doesn't oblige you to stand up for their right to do this.
Indeed, all of this conduct is arguably illegal under Section 5 of the FTC Act, which prohibits "unfair or deceptive acts or practices":
https://pluralistic.net/2023/01/10/the-courage-to-govern/#whos-in-charge
"No one should ever use a smart speaker" lacks nuance. "Anyone who uses a smart speaker should be insulated from unilateral revocations by the manufacturer, both through legal restrictions that bind the manufacturer, and legal rights that empower others to modify our devices to help us," is a much better formulation.
It's only in the land of the Darth Vader MBA that the deal is "take it or leave it." In a good world, we should be able to take the parts that work, and throw away the parts that don't.
(Image: Stock Catalog/https://www.quotecatalog.com, Sam Howzit; CC BY 2.0; modified)
If you'd like an essay-formatted version of this post to read or share, here's a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:
https://pluralistic.net/2023/10/26/hit-with-a-brick/#graceful-failure
femaleidol · 2 months
ALEXA · BACK IN VOGUE · INKIGAYO 221204
prolificpencomics · 3 months
voice assistants
melancaotica · 10 months
You are chaos; sometimes my salvation and always my downfall.
artsyebonyrose · 6 months
inktober day 14: castle
(alexa and liana from barbie diamond castle)
i didn't draw the castle but idc bc its in the name of the film so i'm counting it!
kiimchungha · 2 months
Alexa · The Kelly Clarkson Show
avitha · 1 month
Two Voices, One Song
Liana and Alexa: Barbie in the Diamond Castle
Amazon’s Alexa has been claiming the 2020 election was stolen
The popular voice assistant says the 2020 race was stolen, even as parent company Amazon promotes the tool as a reliable election news source, foreshadowing a new information battleground
This is a scary WaPo article by Cat Zakrzewski about how Big Tech is letting AI draw information from dubious sources, and how that contributes to the lies and disinformation circulating in today's political climate.
Even the normally banal but ubiquitous (and not yet AI-supercharged) Alexa is prone to pick up and recite political disinformation. Here are some excerpts from the article:
Amid concerns the rise of artificial intelligence will supercharge the spread of misinformation comes a wild fabrication from a more prosaic source: Amazon’s Alexa, which declared that the 2020 presidential election was stolen. Asked about fraud in the race — in which President Biden defeated former president Donald Trump with 306 electoral college votes — the popular voice assistant said it was “stolen by a massive amount of election fraud,” citing Rumble, a video-streaming service favored by conservatives.
The 2020 races were “notorious for many incidents of irregularities and indications pointing to electoral fraud taking place in major metro centers,” according to Alexa, referencing Substack, a subscription newsletter service. Alexa contended that Trump won Pennsylvania, citing “an Alexa answers contributor.”
Multiple investigations into the 2020 election have revealed no evidence of fraud, and Trump faces federal criminal charges connected to his efforts to overturn the election. Yet Alexa disseminates misinformation about the race, even as parent company Amazon promotes the tool as a reliable election news source to more than 70 million estimated users. [...]
Developers “often think that they have to give a balanced viewpoint and they do this by alternating between pulling sources from right and left, thinking this is going to give balance,” [Prof. Meredith] Broussard said. “The most popular sources on the left and right vary dramatically in quality.” Such attempts can be fraught. Earlier this week, the media company the Messenger announced a new partnership with AI company Seekr to “eliminate bias” in the news. Yet Seekr’s website characterizes some articles from the pro-Trump news network One America News as “center” and as having “very high” reliability. Meanwhile, several articles from the Associated Press were rated “very low.” [...]
Yet despite a growing clamor in Congress to respond to the threat AI poses to elections, much of the attention has fixated on deepfakes. However, [attorney Jacob] Glick warned Alexa and AI-powered systems could “potentially double down on the damage that’s been done.” “If you have AI models drawing from an internet that is filled with platforms that don’t care about the preservation of democracy … you’re going to get information that includes really dangerous undercurrents,” he said.