
Technology & Innovation

Top shows and movies by hours watched

Diane Davis


Netflix is retooling its metrics system to reflect a critical measure of any show’s success in this 21st-century streaming age: its binge-watchability.

While the entertainment giant’s metrics previously focused on the number of unique accounts that viewed any given title, it’s now shifting to report the total, collective number of hours spent viewing each title, as alluded to in an October letter to shareholders. Along with this new methodology, Netflix is also launching a score-keeping website, Top10.Netflix.com, which will post a running chart of the most-watched series and films—both English and foreign-language—to be updated weekly.

Netflix, which historically kept its viewership data under tight wraps, has become more forthcoming about numbers in recent years as it seeks to lure top talent to its production studios, though it has drawn criticism every step of the way. Its prior method of tracking view counts, which it had employed for the past two years, defined a “view” as any time a subscriber streamed at least two minutes of a show or movie—barely enough time to scratch the surface of the opening credits. Critics slammed the method as both hilariously inaccurate and a thinly veiled ploy to boost watch counts, and Netflix finally appears to be addressing those complaints with its metrics overhaul.

“Figuring out how best to measure success in streaming is hard,” Netflix said in a blog post, and there’s no perfect system. “Having looked at the different options, we believe engagement as measured by hours viewed is a strong indicator of a title’s popularity, as well as overall member satisfaction, which is important for retention in subscription services,” wrote the company’s vice president of content strategy, Pablo Perez De Rosso. “Hours viewed mirrors the way third parties measure popularity” and also allows the company to factor in repeat-watches, which Perez De Rosso calls a “strong sign of member joy.” It does, however, favor longer films and series, as the company acknowledges.

It hopes to offset that slant by occasionally publishing specialty lists, “for example, top documentary features or reality shows, which our members love but may appear less prominently” on Top10.Netflix.com—a website the company bills as the place to go to contextualize your friend’s newfound passion for chess (ahem, The Queen’s Gambit) or “what’s going on with all those ‘cool cats and kittens’” (their words, not ours). The site, which goes live tomorrow, currently lists Jumanji: The Next Level, Transformers: The Last Knight, and Red Notice among its most-streamed movies, with the latter racking up nearly 150 million hours of view time in the last week alone. Squid Game and Arcane, an animated fantasy series set in the League of Legends universe, were among the top shows. The former, a South Korean cult hit, totaled nearly 43 million hours last week. According to Netflix in October, that series was watched by 142 million households—two-thirds of its 214 million global subscriber base—in the month after its release, smashing company records.

Per Nielsen estimates, Netflix accounted for 6% of total TV-watching time in the United States in September. Of its competitors, YouTube accounted for 6%, Hulu for 3%, Disney+ for 1%, and a bucket category dubbed “other streaming” for 9%.



Technology & Innovation

What Luddites can teach us about resisting an automated future

Diane Davis


A story in comic format. In this first panel, two figures in silhouette look out at a modern city skyline.  The text reads, "The future is here already. AI art has arrived. Simply write a prompt..."

A person's smiling headshot being uploaded.  The text reads, "Or allow a company unrestricted access to your likeness..."
The headshot from the previous panel with distorted features and a wavy new background. The text reads, "...And ta-da! Some absolutely serviceable hotel lobby art. But calling this 'AI art' uncritically buys into the AI hype machine."
Two people look at the blank space where a framed picture of a flower has been stolen by a giant robot hand. The text reads, "'Algorithmically generated art theft' might be more accurate."

Two panels. In the first, a scribbly line resembling a signature. The text reads, "Machine-learning software is 'taught' by feeding it existing art without consent, credit, or compensation. To the extent that mangled signature remnants remain visible in AI art." In the second, the text reads, "Artists have begun to push back," over the image of protesters led by Matthew Butterick, co-counsel in a class-action lawsuit brought by artists against AI art companies, who says, "Everybody who creates for a living should be in code red."
The text reads, "A common refrain from defenders of AI art has been to label these naysayers:"  above multiple speech bubbles saying "Luddite!" "Luddites!" and "Luddism!"
Text across the top continues, "...a term synonymous with technophobia, anti-progress, and reactionism. It's even used to describe being hapless with new tech." A group of people eating dinner together, and one person is saying, "Oh, I can't use TikTok! I'm such a Luddite!" The text below the image reads, "In truth, the Luddites were skilled with machines. They were simply fighting for better worker rights."

A man in a cravat holding a plume at his writing desk smiles, while behind in shadow are two women at textile machines. The text reads, "In 1799, the British government passed legislation that prohibited trade unions and collective bargaining. Mill owners introduced more machines in their factories, reducing wages."
A man with his shirtsleeves rolled up, raising a sledgehammer. The text reads, "And so, in 1811, after years of frustrating negotiations, a spate of coordinated attacks on mill frames erupted across the United Kingdom, led by 'King Ned Ludd.'"

Sheets of paper flutter down to a gallows in the foreground as soldiers with bayonets raised corral a crowd of civilians behind. The text reads, "But then, 14,000 soldiers were sent to protect the mills and quash riots. New legislation made frame breaking a capital offense. Eventually, key Luddite organizers were identified, arrested, and executed."
Lord Byron reciting from "Song for the Luddites" says with his eyes closed, "This is the dew, which the tree shall renew Of Liberty, planted by Ludd!" The text above reads, "Yet the movement lived on. Outbreaks of machine breaking continued for years in the UK and beyond."
A woman smashing something with a sledgehammer. The text reads, "In 1837, a spinning jenny in Chalabre, France, was destroyed by workers. The women workers 'made themselves conspicuous by their fury and violence,' according to the local newspaper L'Aude."

The text reads, "In the 1870s, English textile designer, author, and socialist William Morris began taking an interest in the production process of his designs. He was disgusted by the poor living conditions of workers and the pollution caused by the textile industry." William Morris in profile saying, "Why does a reasonable man use a machine? Surely to save his labour. Under capitalism, machines were primarily used to increase production, thereby increasing the worker's drudgery."
The text reads, "People have been making similar arguments against AI art." A tweet from CatBastardQuinn, @QuinnCat13 reads, "We could automate menial jobs so people have time to make art and music but apparently we'd rather automate art and music so people have time for menial jobs."
An office worker at a laptop with face in hands and an Amazon worker in a pile of shipping boxes. The text reads, "Likewise, Morris was interested in what he called 'worthy work.' He wanted people to take pleasure in their work rather than 'mere toiling to live, that we may live to toil.'"
The text reads, "He understood that machines were only as progressive as the people who used them." Morris facing us from a field of app notifications, says "[An unfree man is a] slave to machinery; the new machine MUST be invented, and when invented he MUST - I will not say use it, but be used by it whether he likes it or not."
Frederick Taylor, holding a stopwatch, oversees factory workers standing around machines. The text reads, "In America, mechanical engineer Frederick W. Taylor began using what he called 'Scientific Management' in his factories. Taylor timed each worker's every movement, breaking their work down into a set of discrete tasks, then demanded workers speed them up."
The text reads, "Taylorism, as it became known, was less a science than a political ideology concerned with remolding workers into pliant subjects." Taylor faces us to say, "A complete mental revolution on the part of the workingman...toward their work, toward their fellow men, and toward their employers."
The text reads, "But the spirit of Luddism lived on. Seemingly out of nowhere, there was a rash of mechanical breakdowns at Taylor's factories." An angry Taylor tells us, "These men were deliberately breaking their machines," while two men stand behind him shrugging and smiling slightly as a plume of smoke rises from an unseen area.

The cover of a book entitled Sabotage: Its History, Philosophy & Function, by Walker C. Smith, shows a black cat climbing atop a bag of money. The text reads, "Despite setbacks and intervention from Congress, Taylorism spread. The Industrial Workers of the World responded by publishing two tracts on the topic of sabotage in 1913. 'The aim is to hit the employer in his vital spot, his heart and soul, in other words, his pocketbook.'"
Hands reach to an assembly line where the words "Automation - as it was later dubbed by Delmar Harder, vice president of Ford Motor Company, in 1947 - continued unabated." repeat on each panel of the conveyor.

The text reads, "The legacy of the Luddites lived on in the latter half of the 20th century. Confined to the lowest-paying jobs, black workers were the first to be targeted by the midcentury push for automation.  Robert L. Allen quotes from his book, Black Awakening in Capitalist America (1969) "Not only is the economic situation of the masses of blacks grim, but the prospects are that it will not improve, rather it will deteriorate. This is due partly to the unregulated impact of automation."
The text reads, "The Black Panthers were also quick to recognize <a href=technology and automation were not politically neutral." Huey Newton, cofounder of the Black Panther Party says from a podium, "If the ruling circle remains in power it seems to me that capitalists will continue to develop their technological machinery because they are not interested in the people…Every worker is in jeopardy because of the ruling circle."” class=”wp-image-1088535″ />
A group of Black Panthers standing together.  The text reads, "The Black Panther Party's Ten Point Program was updated in 1972. 'People's community control of modern <a href=technology‘ was added to the demands for ‘land, bread, housing, education, clothing, justice, peace."” class=”wp-image-1088536″ />

A conveyor belt with a chart titled "Overrepresentation of African Americans in 3 occupation categories with the highest expected displacement, %." The statistics are office support (e.g., secretaries) with a displacement rate of 36%, food services (e.g., fast-food cooks) with 35%, and production work (e.g., machinists) with 34%. The text reads, "A 2019 McKinsey Global Institute report found that: 'African Americans are overrepresented in occupations likely to be the most affected by automation.' And the 2030 outlook doesn't look much better."
A crowd of students holding a sign that says "STRIKE" and a button that says, "I am a human being: do not fold, spindle, or mutilate" The text reads, "The introduction of the computer in the 1970s only exacerbated growing tensions. Punch cards used in universities, similar to cards used for drafting recruits for Vietnam, were seen as a symbol of bureaucracy and alienation. Students burned, vandalized, and otherwise destroyed punch cards for course registration."
Lisa Gitelman, author of Always Already New: Media, History and the Data of Culture (2008), says, "Many people simply stopped drawing distinctions between one card-enabled system and another. Whether the cards registered draftees or pupils, they helped 'the system.'"
Men in a warroom facing a map. The text reads, "Vietnam had become the first computational war, bringing a shift toward strategies rooted in quantitative data collection and automated analysis."

A fighter plane with the name "Tom Cat" on the side. The text reads, "Sensor arrays and unmanned drones became a big part of the war, an ongoing experiment with the intention of eventually replacing human pilots. American soldiers were sabotaging equipment, staging protests, and refusing to fight. The automation of war, as with the automation of industry, was a political strategy designed to, once again, reassert control over rebellious workers."
A person at the self-checkout frustrated by the machine's error message, which says, "Unexpected item in the bagging area!" The text reads, "Now, as more of our daily lives are automated, people are finding that it doesn't always make our lives easier."
The text reads, "In 2016, a study revealed that physicians spend two hours on computer work for every hour spent speaking with a patient face to face." Atul Gawande, surgeon and public health researcher, tells us, "I've come to feel that a system that promised to increase my mastery over my work has, instead, increased my work's mastery over me."

The text above people using computers reads, "In the rush to claim success with new AI software, this invisible human work becomes even more insidious. Researcher Jathan Sadowski calls it 'Potemkin AI.'  Sadowski says to us, "There is a long list of services that purport to be powered by sophisticated software, but actually rely on humans acting like robots."
Writer and filmmaker Astra Taylor calls these modern-day Mechanical Turks an example of "fauxtomation" which she says "reinforces the idea that work has no value if it is unpaid and acclimates us to the idea that one day we won't be needed."
A sad person holds a robot mask to their face. The text reads, "As a 2019 report by the think tank Data & Society concludes, 'Automated and AI technologies tend to mask the human labour that allows them to be fully integrated into a social context while profoundly changing the conditions and quality of labour that is at stake.'"

The text reads, "Opposition to 21st century tech can be found in unlikely places. Silicon Valley executives are restricting their own children's screen time and sending them to tech-free schools." Taewoo Kim, chief AI engineer at machine learning startup One Smart Lab, tells us from in front of a Waldorf school, "You can't put your face in a device and expect to develop a long-term attention span."
A bullet-shaped robot outside near grass has sauce on it. The text reads, "In San Francisco, security robots sent to harass the homeless have been repeatedly assaulted, with one being covered in BBQ sauce and wrapped in a tarp."
An autonomous vehicle with a broken window. The text reads, "In Arizona, people are slashing the tires of driverless cars after a self-driving Uber struck and killed a woman in Tempe." A quote from Douglas Rushkoff, author of Team Human (2019), reads, "People are lashing out justifiably. There's a growing sense that the giant corporations honing driverless technologies do not have our best interests at heart."
A pie chart with the words, "A Pew Research poll found that 85% of Americans favored the restriction of automation to only the most dangerous forms of work."

People standing around and looking at their cell phones. The text reads, "We're all living through an era in which we have become the product."  Shoshana Zuboff, author of The Age of Surveillance Capitalism (2018) tells us, "The age of surveillance capitalism is a titanic struggle between capital and each one of us. It is a direct intervention into free will, an assault on human autonomy. To tune and herd and shape and push us in the direction that creates the highest probability of their business success. [There's no way] to dress this up as anything but behavioral modification."
A facial recognition target over the face of a Black person next to a hand pointing at the word "Error." The text reads, "Many are also drawing attention to the biases of software made by an almost entirely male, predominantly white workforce." A quote from Timnit Gebru, cofounder of Black in AI, says, "I'm worried about groupthink, insularity, and arrogance in the AI community. [...] If many are actively excluded from its creation, this technology will benefit a few while harming a great many."
The text reads, "VPNs, the dark web, and plugins like RequestPolicy are arguably Luddite responses to new technology. Computer science students have already developed Glaze, a tool to prevent AI models from mimicking artist styles." Maxigas, in Resistance to the Current: The Dialectics of Hacking (2022), says, "A retrograde attempt to rewind web history: a Luddite machine that, as they say, 'breaks' the essential mechanisms of websites."

Gavin Mueller, in Breaking Things at Work (2021), says, "Luddism contains a critical perspective on technology that pays particular attention to technology's relationship to the labor process and working conditions. In other words, it views technology not as neutral but as a site of struggle. Luddism rejects production for production's sake: it is critical of 'efficiency' as an end goal."
Above a person tearing a line chart, the text reads, "'Degrowth,' 'slow living,' 'quiet quitting,' and the 'I Do Not Dream of Labour' movements could all be described as forms of modern neo-Luddism." Jason Hickel, author of Less Is More: How Degrowth Will Save the World (2020), says, "Lashed to the growth imperative, technology is used not to do the same amount of stuff in less time, but rather to do more stuff in the same amount of time."

A man with a sledgehammer resting on his shoulder as he looks out at the horizon. The text reads, "Questioning and resisting the worst excesses of technology isn't antithetical to progress. If your concept of 'progress' doesn't put people at the center of it, is it even progress? Maybe those of us who are apprehensive about AI art are Luddites. Maybe we should wear that badge with pride. Welcome to the future. Sabotage it."

Tom Humberstone is a comic artist and illustrator based in Edinburgh, Scotland.


Technology & Innovation

China’s next cultural export could be TikTok-style short soap operas

Diane Davis


Poster of the short drama "Mr. Williams! Madame Is Dying," showing the two protagonists.

Web novels are a unique form of literature that has been popular on the Chinese internet for much of the last two decades: long stories that are written and posted chapter by chapter every day. Each chapter can be read in less than 10 minutes, but installments will keep being added for months if not years. Readers become avid fans, waiting for the new chapter to come out every day and paying a few cents to access it.

While some talented Chinese book authors got their big break by writing web novels, the majority of these works are the popcorn of literature, offering daily bite-size dopamine hits. For a while in the 2010s, some found an audience overseas too, with Chinese companies setting up websites to translate web novels into English.

But in the age of TikTok, long text posts have become less popular online, and the web-novel industry is looking to pivot. Business executives have realized they can adapt these novels into super-short dramas. Both forms aim for the same market: people who want something quick to kill time in their commute, or during breaks and lunch.

Many of the leading Chinese short-drama apps today work closely with Chinese web-novel companies. ReelShort is partially owned by COL Group, one of the largest digital publishers in China, with a treasure trove of novels that are ready for adaptation.

Poster of the short drama Mr. Williams! Madame Is Dying.

COURTESY OF FLEXTV

To get a quick sense of what these stories are like, you just need to take a look at their titles: President’s Sexy Wife, The Bride of the Wolf King, Boss Behind the Scenes Is My Husband, or The New Rich Family Grudge.

One of the highest-grossing shows on FlexTV is called Mr. Williams! Madame Is Dying. It’s a corny romance story about a love triangle, ultra-rich families, cancer, rebirth, and redemption, and it was adapted from a Chinese web novel that has nearly 1,300 chapters. The original story has been turned into a Chinese short drama, but FlexTV decided to shoot another version in Los Angeles for an international audience.

These short dramas prioritize quick, oversimplified stories of love, wealth, betrayal, and revenge, sometimes featuring mythical creatures like vampires and werewolves. Stories of marrying into a rich family attract men, while stories with a powerful female protagonist in control of her life appeal to women, says Gao, the COO of FlexTV. 

“Quibi mostly served the [artistic] pursuits of directors and producers. They thought their tastes were better than the general public and their work was to be appreciated by the elites,” he says. “What we are making is more like fast-moving consumer goods. It’s rooted in the needs of ordinary users.”


Technology & Innovation

Algorithms are everywhere

Diane Davis


cover of Algorithms for the People

“The present is not a prison sentence, but merely our current snapshot,” they write. “We don’t have to use unethical or opaque algorithmic decision systems, even in contexts where their use may be technically feasible. Ads based on mass surveillance are not necessary elements of our society. We don’t need to build systems that learn the stratifications of the past and present and reinforce them in the future. Privacy is not dead because of technology; it’s not true that the only way to support journalism or book writing or any craft that matters to you is spying on you to service ads. There are alternatives.” 

A pressing need for regulation

If Wiggins and Jones’s goal was to reveal the intellectual tradition that underlies today’s algorithmic systems, including “the persistent role of data in rearranging power,” Josh Simons is more interested in how algorithmic power is exercised in a democracy and, more specifically, how we might go about regulating the corporations and institutions that wield it.

PRINCETON UNIVERSITY PRESS

Currently a research fellow in political theory at Harvard, Simons has a unique background. Not only did he work for four years at Facebook, where he was a founding member of what became the Responsible AI team, but he previously served as a policy advisor for the Labour Party in the UK Parliament. 

In Algorithms for the People: Democracy in the Age of AI, Simons builds on the seminal work of authors like Cathy O’Neil, Safiya Noble, and Shoshana Zuboff to argue that algorithmic prediction is inherently political. “My aim is to explore how to make democracy work in the coming age of machine learning,” he writes. “Our future will be determined not by the nature of machine learning itself—machine learning models simply do what we tell them to do—but by our commitment to regulation that ensures that machine learning strengthens the foundations of democracy.”

Much of the first half of the book is dedicated to revealing all the ways we continue to misunderstand the nature of machine learning, and how its use can profoundly undermine democracy. And what if a “thriving democracy”—a term Simons uses throughout the book but never defines—isn’t always compatible with algorithmic governance? Well, it’s a question he never really addresses. 

Whether these are blind spots or Simons simply believes that algorithmic prediction is, and will remain, an inevitable part of our lives, the lack of clarity doesn’t do the book any favors. While he’s on much firmer ground when explaining how machine learning works and deconstructing the systems behind Google’s PageRank and Facebook’s Feed, there remain omissions that don’t inspire confidence. For instance, it takes an uncomfortably long time for Simons to even acknowledge one of the key motivations behind the design of the PageRank and Feed algorithms: profit. Not something to overlook if you want to develop an effective regulatory framework. 


Much of what’s discussed in the latter half of the book will be familiar to anyone following the news around platform and internet regulation (hint: that we should be treating providers more like public utilities). And while Simons has some creative and intelligent ideas, I suspect even the most ardent policy wonks will come away feeling a bit demoralized given the current state of politics in the United States. 

In the end, the most hopeful message these books offer is embedded in the nature of algorithms themselves. In Filterworld, Chayka includes a quote from the late, great anthropologist David Graeber: “The ultimate, hidden truth of the world is that it is something that we make, and could just as easily make differently.” It’s a sentiment echoed in all three books—maybe minus the “easily” bit. 

Algorithms may entrench our biases, homogenize and flatten culture, and exploit and suppress the vulnerable and marginalized. But these aren’t completely inscrutable systems or inevitable outcomes. They can do the opposite, too. Look closely at any machine-learning algorithm and you’ll inevitably find people—people making choices about which data to gather and how to weigh it, choices about design and target variables. And, yes, even choices about whether to use them at all. As long as algorithms are something humans make, we can also choose to make them differently. 

Bryan Gardiner is a writer based in Oakland, California.

