Friday, December 27, 2019

Cybersecurity

Today, reading Manolo, he posted something on the subject. I very much fear that the average politician hasn't the slightest idea and believes that his cousin's nephew, the one who really does fix the machines and downloads movies, is all that's needed.

NO

But, in any case, the little (or much) I know of the Argentine state is discouraging. Oh well.

This is interesting.


Huxley

Following the spirit of the previous post? I was reminded of this business about solidarity: if we have to show it, let's all show it. Teachers, judicial employees, politicians, etc., etc., everyone into the general regime, and let them go to court if they like.


82%

Today I had an interesting discussion in which a person I respect a great deal insisted that the retirees' 82% was an invention of the 2009 Badaro case; in short, a creation of the bad guys.
It turns out the 82% goes back at least to 1958, to Law 14,499, Article 2:

ARTICLE 2 - The ordinary retirement benefit shall be equivalent to 82%, adjustable, of the monthly remuneration assigned to the position, trade, or function held by the member at the date of cessation of service or at the time the benefit is granted, or to the highest-ranking position, trade, or function he may have held.
For this purpose, a minimum period of twelve consecutive months in the position, trade, or function shall be required. If this period is shorter, or if those positions bear no adequate relation to the rank of the ones held by the agent over his career, the positions held during the 3 years immediately preceding the cessation of service shall be averaged.
Remuneration shall be understood to mean the pay fixed by the budget or by collective bargaining agreements, plus additional supplements of any kind, provided they are habitual, regular, and permanent in character.
For remuneration based on commissions, the retirement benefit shall be determined by the average of the twelve most favorable consecutive months for which contributions were made to the respective fund, and benefits shall be updated annually by applying coefficients based on the cost-of-living index produced by the Dirección Nacional de Estadística y Censos.
This adjustability does not modify the benefit regime established by systems more favorable to the member.
Legislators are excluded from these increases for the duration of their current term of office.
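Just to make the arithmetic of Article 2 concrete, here is a minimal sketch of the rule as I read it, in Python (purely illustrative; the function and variable names are mine, not anything from the law or from any official system): 82% of the salary of the final position if it was held for at least twelve consecutive months, otherwise 82% of the average over the positions held in the last three years.

# Illustrative sketch of the 82% rule in Article 2 of Law 14,499 (1958).
# All names are hypothetical; this is a reading aid, not a real benefit calculator.

def ordinary_benefit(last_salary, months_in_position, last_three_years_salaries):
    """Monthly benefit under the 82% 'movil' rule.

    If the final position was held for at least 12 consecutive months,
    the benefit is 82% of that position's salary; otherwise the salaries
    of the positions held over the previous 3 years are averaged first.
    """
    RATE = 0.82
    if months_in_position >= 12:
        base = last_salary
    else:
        base = sum(last_three_years_salaries) / len(last_three_years_salaries)
    return RATE * base

# Example: a final post paying 100,000, held for only 8 months, after earlier
# posts paying 80,000 and 90,000, yields 82% of the 3-year average:
print(ordinary_benefit(100_000, 8, [80_000, 90_000, 100_000]))  # 73800.0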

What a thing the Internet is. Admit I was right, though? Not a chance in hell.

Thursday, December 19, 2019

Economist

Looking for this article I ended up at the previous post; but, keeping up the habit of swiping articles:

The World The Economist Made
How a 170-year-old magazine has struggled to uphold liberal capitalism.

Warnings of a crisis of liberalism have become commonplace, as it is assailed by an illiberal right on the one side and a socialist left on the other. The situation is plain, and it is at least partially self-inflicted. Liberalism has missed opportunity after opportunity. It is not merely the tepid response to the financial crisis. One critic warns that already in the ’80s there were “fresh and full disclosures of poverty” and “the decay of rural industry and population,” yet liberals made no serious proposals for reform. “The old laissez faire individualism,” he reasons, “was still too dominant.” Nevertheless, he holds, there is the possibility of renewal. The cause of liberty can be advanced through social reform, through a more robust welfare state, through attacks on monopolies and unearned property. The writer is the social scientist J.A. Hobson. The year is 1909.




Today, Hobson is best remembered for his 1902 work Imperialism, the major thesis of which was cribbed by Lenin. (They agreed that the drive for empire was caused by the search for profits under capitalism.) Yet the parallels between Hobson’s analysis and the widespread “crisis of liberalism” literature of our own times are striking. Hobson’s ’80s were, of course, the 1880s rather than the Reagan years, but his complaints about laissez-faire individualism find their echoes in today’s critiques of neoliberalism. Even his eventual refuge feels contemporary: After trying to convince liberals to include more socially protective programs, Hobson eventually gave up and became a socialist.


That this 110-year-old complaint seems to speak to our own times is a testament both to the longevity of liberalism and to the difficulty—perhaps impossibility—of resolving its tensions. In his book Liberalism at Large, Alexander Zevin seeks to trace this history through a study of a single publication: The Economist, which still describes itself as a newspaper rather than a magazine. Published continuously since 1843, The Economist is deeply identified with the liberal project, and has long proved influential throughout the world. Its readers have ranged from Karl Marx to Benito Mussolini, from Franklin Roosevelt to Angela Merkel. Even today, when it is harder and harder to make journalism pay, it remains profitable and widely circulated, whistling past the graveyard of many a mangled media property. It has more than a million subscribers, despite being the most expensive weekly in the United States. Unlike most magazines, it de-emphasizes individual authors, only occasionally posting bylines and relying instead on a strong institutional voice and a dry wit.




Zevin, who is an editor at the New Left Review, regards The Economist as one imagines an antelope might regard a crocodile—impressed by its longevity and power, suspicious of its habitat, and wary of its bite. Parts of the liberal program have been incorporated across the political spectrum: As Zevin describes it, liberalism “combined economic freedoms—the right to unconditional private property; low taxes; no internal tariffs; external free trade—with political freedoms: the rule of law; civil equality; freedom of the press and assembly; careers ‘open to talent’; responsible government.” But it has also been closely associated, at least in the dominant version represented by The Economist, with financial power. Zevin argues that The Economist’s unabashed pro-market position and its close relationship to the City of London (the United Kingdom’s equivalent of Wall Street) explain its influence, its importance, and its blind spots. For over 150 years, he writes, The Economist has been “offering up the sort of political advice that markets themselves might, if only they could speak.”
For Zevin, this is the core of the problem. The bankers and businesspeople who read it are treated to capsule summaries of world affairs and charts, presented with an air of knowing superiority. A senior editor once told a nervous new recruit that to write like The Economist, you just “pretend you are God.” But markets are human institutions, not divine ones. As one writer for the paper put it, The Economist was the place where you could “hear the bourgeoisie talking to itself, and it could talk quite frankly.” What Zevin finds is a frequently unreflective conversation. His Economist is an institution that has too often been unable to confront, or resolve, liberalism’s troubled relationships with democracy, empire, and the expanding power of finance itself. 

The Economist was born out of the British economic crisis of 1837. That financial panic produced both the Chartists, a primarily working-class movement for universal male suffrage, and the Anti-Corn Law League, led by the manufacturing middle class. The Corn Laws imposed tariffs on foreign wheat, raising the price of bread to protect domestic agriculture. The Anti-Corn Law League demanded their repeal and their substitution with a policy of free trade. It was this campaign that The Economist was created to support.

Its founder, James Wilson, was born in 1805 to a wealthy Quaker family. As a young man, he received a substantial gift from his father: 2,000 pounds, equivalent to at least $250,000 in 2019. But in 1837 he lost most of his accumulated wealth by betting on the price of indigo. Removing volatility in the trade cycle would have saved him a great deal of money. And that would be part of what free trade could do, he reasoned in an 1839 pamphlet. Trade restrictions were unnatural, and their removal would benefit all classes. He became a regular speaker at Anti-Corn Law events, known for using statistics to make his case, and political doors began to open for him. In 1843, he established The Economist (the word’s meaning at the time referred to thriftiness, rather than professional standing in the field of economics). He sought readers among Corn Law opponents, and by the 1850s his newspaper was read in influential circles.


Wilson set the template for many of The Economist’s future editors. Engaged with the issues of their time, they wrote books and pamphlets alongside their contributions to the publication. They were politically connected—Wilson himself was elected to Parliament in 1847. He was also stubbornly committed to principle, identifying the cause of capital with the cause of all humankind. His newspaper opposed laws that would have limited workday hours, arguing that the interests of workers and employers were identical. There was no productive role for class conflict in Wilson’s worldview, nor indeed for public education, charity schools, or town sanitation. “If the pursuit of self-interest, left equally free for all, does not lead to the general welfare,” the paper declared, “no system of government can accomplish it.”

Wilson, who died of dysentery in 1860, is less remembered than his son-in-law and successor as Economist editor, Walter Bagehot. Zevin does not share a sentimental view of this prolific writer and editor as the “greatest Victorian” (as the historian Jacques Barzun described him). More pragmatic than Wilson, Bagehot did favor a permanent graduated income tax and, in his 1873 book Lombard Street, the idea of a central bank as a lender of last resort. But his love of finance clashed with democratic demands. He wanted a government that was maximally compatible with the needs of finance. (When he was feeling sad, Bagehot is reported to have made a habit of going to the bank to run his hands through heaps of coins.) The thesis of his book The English Constitution, written in 1867, was that the British government worked not because of the separation of powers, but because the real work of governing was done by the Cabinet while the monarchy put on a performance of governing to please what he called “the vacant many.” “Every person has a right to so much power,” wrote Bagehot, “as he can exercise without impeding any other person who would more fitly exercise such power”—a view far more aristocratic than libertarian. “In all cases it must be remembered,” he declared, “that a political combination of the lower classes as such and for their own objects, is an evil of the first magnitude.”


Bagehot’s belief in the British ruling class’s special fitness also extended to questions of empire. Both Wilson and Bagehot were broadly believers in liberal empire, and Wilson approved, for example, of the opening of China by violence. (In 1857, the paper wrote, “We may regret war … but we cannot deny that great advances have followed in its wake.”) Bagehot, for his part, thought the British the “most enterprising, the most successful, and in most respects the best, colonists on the face of the earth.” The Economist celebrated British imperialism above others, assuming its good intentions and that it worked to promote trade. But it rarely criticized: It stayed silent on the discovery of British-run concentration camps in the Second Boer War, for example. Zevin argues that The Economist’s pro-finance position necessarily made it a cheerleader for empire, since empire was the framework within which wealth was being created.


But liberalism has many currents, and in response to imperial abuses and demands for social reform, in the early twentieth century The Economist entered a new period. Its editorial line acted as a kind of barometer of liberal conventional wisdom, responding to the atmospheric pressure of world events. Editor Francis Hirst, who took the top job in 1907, held, much like Hobson, that the scramble for Africa was the result of “financial imperialism,” rather than the means to pass on the supposed blessings of civilization. Hirst condemned military aggression, arguing that reducing spending on arms was the only way to carry out needed social reforms while keeping taxes low. Perhaps Hirst’s statement that the British Constitution was “only a mask over the face of plutocracy” was not so analytically distinct from Bagehot, but Hirst at least meant it as a criticism.


Hirst was ultimately let go for his opposition to World War I, but the publication continued to reflect the rising influence of the Labour Party, and the more radical demands of its time. Many students of Keynes wrote for the paper during the Great Depression, and Douglas Jay, later a Labour member of Parliament, even wrote a book called The Socialist Case in 1937 while on staff. The Economist hardly abandoned its free-trade principles: It felt that, under the circumstances, it was essential to help ensure that the Labour Party understood the needs of the bankers, and how they might be properly integrated into a mixed economy. The Economist even endorsed the Beveridge Report in 1942, which laid the foundations for the postwar welfare state and the National Health Service in Britain. It was in this period, from 1938 to 1956, under the editorship of Geoffrey Crowther, that the readership expanded significantly, and the paper developed its signature style: witty and pragmatic, staking out what Crowther described as the “extreme center.”



As postwar turned to Cold War, The Economist swung back toward positions that resembled its early years. Defending the liberal project now became synonymous with defending American empire, and attacking socialism wherever it occurred. The paper cheered the Marshall Plan and Truman’s commitment to anti-communism in Greece, Turkey, and Korea. Espionage entered the newsroom. Notorious Soviet spy Kim Philby served for a time as the Middle East correspondent, turning in blandly anti-communist articles to maintain his cover. East Asia correspondent Brian Crozier also took over the bulletin known as the Foreign Report, where he regularly reprinted content from the CIA and the U.K.’s own propaganda outfit, the Information Research Department. The magazine followed U.S. propaganda closely in places like Vietnam and Chile. The Chile correspondent danced down the hallways of The Economist shouting “My enemy is dead!” when news of the coup against socialist Salvador Allende reached him in 1973.

With the end of the Cold War, the newspaper has cheered and defended globalization and the power of unbridled markets. Norman Macrae, a longtime deputy editor, claimed to have coined the term “privatization.” After the protests accompanying the World Trade Organization meetings in Seattle in 1999, the newspaper put an Indian girl wearing a shawl on the cover, making her the face of “The real losers from Seattle.” If anti-globalization protesters got their way, it reasoned, she would lose her path out of poverty. Meanwhile, it mocked activists for having the temerity to protest in the city of globalized companies such as Microsoft and Boeing. Capitalism was still the only engine of wealth creation around, and those pointing out the damage that the late-twentieth-century version of globalization was doing to democracy or to industrial laborers were to be chided rather than taken seriously.


After the financial crisis of 2008, the paper called for minor adjustments to the global economic system rather than wholesale rethinking. “All the signs are pointing in the same direction,” it wrote: “a larger role for the state, and a smaller and more constrained private sector. This newspaper hopes profoundly that this will not happen.” Through all its changes, and over more than a century and a half, The Economist’s enthusiasm for free trade remains undiminished.



As a book, Liberalism at Large juggles several tasks. It is an institutional history of the paper, exploring the lives and personalities of its writers and editors. It is an account of the positions The Economist staked out on the major issues of the day. It is an argument that The Economist’s connections to power have made it a participant in the building of the liberal project rather than just a chronicler. And it is a critique of the paper’s way of looking at the world. There is so much to do that the book can feel simultaneously rushed and overlong.


Zevin’s critique, pronounced atop a formidable pile of research, is forceful and serious: The Economist, as the voice of finance capital, is too self-satisfied to see its own role in the world’s troubles. “Averting their gaze,” concludes Zevin, “liberals have scratched their heads at the political volatility of the present, unable to recognize their handiwork.” Placing the needs of capital above those of democracy has led to backlash; the paper may never have met an act of liberal imperialism of which it did not approve; and financialized capitalism has grown out of control. The paper may disapprove of Trump and Brexit (it endorsed Clinton in 2016 and favors remaining in the European Union), but backlash to the kind of globalization it has championed is one of the reasons that we now have to live with both. The Economist may be witty, it may be contrarian, it may be informative, but it is also implicated in many current problems. When markets speak for themselves, it turns out, they lack a culture of self-criticism. 


Underlying Zevin’s critique is the hope that, from the failures of liberal capitalism, a more just and egalitarian alternative will emerge. But the history relayed in the book does not necessarily align with those hopes. Zevin sees The Economist as representing the “dominant” strain of liberalism: the one that both gets to act upon the world and has to adapt to it. In the early twentieth century, when the paper grew closer to Labour and advocated serious expansions of the welfare state, it did so not out of charity but out of necessity. Always eager to protect market economies, it saw that reform was essential to maintain popular support for the system it held responsible for wealth creation.


If this reading of the newspaper’s trajectory is correct, then what is to be thought of the fact that The Economist has offered so few concessions after the financial crisis of 2008? In the wake of the financial crisis, it has made the rhetorical switch from defending “capitalism” to “liberalism.” But it can still report on the idea of a global wealth tax, for example, without feeling any need to endorse it. If Zevin is correct to read the magazine as a window into the mind of global capital, its current stance is probably evidence that The Economist believes liberalism is in a stronger position than ongoing discussions of the “crisis of liberalism” would imply. After all, Liberalism at Large can be read as an extended account of liberalism suffering periodic crises and managing to muddle on.



Many of the most urgent problems of our time and of the years to come—prominently including climate change and income inequality—seem to me to require putting markets firmly in their proper place: a subordinate one to other social necessities. Markets are tools, not masters. And the liberalism of political rights like freedom of speech can be decoupled from the liberalism of property rights exemplified by thinking in Wall Street or the City of London. That old laissez-faire individualism is, indeed, not adequate to the tasks we face in 2019. Still, if The Economist, the great barometer of capitalist thinking, can scoff at such sentiments, then it would seem that capitalism is not as imperiled as some might think. Liberalism may be in crisis, but with more than 150 years of experience, it is used to that.










Wednesday, December 18, 2019

The future(?) of work before our eyes

I was looking for a New Republic article on The Economist and ran into this one, which basically sums up what we all know: it is what work looks like in many places today (or at least low-quality work, which would seem to be the only kind that will exist).

Enjoy, if you can.

Life Under the Algorithm

How a relentless speedup is reshaping the working class


Henry Noll was one of the most famous workers in American history, though not by his own choice and not under his own name. Employed at Bethlehem Steel for $1.15 a day, and known among workmates for his physical vigor and thriftiness, Noll was—as the somewhat embellished story goes—selected by an ambitious young management consultant named Frederick Winslow Taylor for an experiment in 1899. One day on the job, Taylor approached Noll—whom he later made famous under the pseudonym “Schmidt”—and asked him, “Are you a high-priced man?” As Taylor rendered the story in his book The Principles of Scientific Management, “Schmidt” replied to the obvious trick question cautiously: “Vell, I don’t know vat you mean.”

“Oh yes, you do,” insisted Taylor. “What I want to know is whether you are a high-priced man or not.”


“Vell,” repeated Schmidt, “I don’t know vat you mean.”


“Oh, come now, you answer my questions,” smirked Taylor. “What I want to find out is whether you are a high-priced man or one of these cheap fellows here. What I want to find out is whether you want to earn $1.85 a day or whether you are satisfied with $1.15, just the same as all those cheap fellows are getting.” 


Schmidt then responded that yes, obviously, he would accept the additional 70 cents (“I vas a high-priced man”). Then, the rub: “You see that pile of pig iron?” Taylor explained that a high-priced man did exactly as told, “from morning till night.” Schmidt, whom Taylor compared unfavorably to an “intelligent gorilla,” would be timed and—as we would put it today—optimized in his every movement. “He worked when he was told to work, and rested when he was told to rest.” In this way, Taylor boasted, Schmidt’s output increased from twelve tons of pig iron moved every day to 47. 


This was the primal scene of “scientific management,” versions of which spread rapidly across the world’s workplaces. The bargain between Schmidt and Taylor represented the explicit formulation of what would become the defining compromise of twentieth-century American capitalism: Increase your output, get paid more. Wages go up with productivity.


Until, it turns out, they don’t anymore. The unwinding of this agreement in recent decades, such that workers must continue to produce more without expecting it to show up in their pay stubs, has now been the subject of a good deal of discussion and debate. The decline of unions, the rise of inequality, the crisis of liberal democracy, and the changing face of American culture all, in one form or another, relate to this transformation. We work and work and barely get by, while wealth pools up in obscene quantities out of view. Pile more pig iron, but don’t imagine you’re high-priced. What, ask new books by Emily Guendelsberger and Steve Fraser, is this colossal insult doing to our heads? No wonder, Guendelsberger observes, the country is collectively “freaking the fuck out.”


In her new book, On the Clock: What Low-Wage Work Did to Me and How It Drives America Insane, Guendelsberger re-creates a version of Barbara Ehrenreich’s famous experiment in Nickel and Dimed. Guendelsberger, a reporter for the alt-weekly Philadelphia City Paper until it was sold off and shut down in 2015, went undercover at three low-wage workplaces: an Amazon warehouse in Indiana, a call center in North Carolina, and a McDonald’s in San Francisco. Whereas Ehrenreich’s main discovery was that there still existed an exploited working class—a controversial point in the late 1990s and early 2000s—Guendelsberger takes inequality and exploitation as given, asking instead what these jobs are doing to the millions who work them.


What does the phrase “in the weeds” mean to you? In the professional-managerial class, “in the weeds” signifies knotty detail (as in the Vox public policy podcast, The Weeds). In the working class, Guendelsberger points out, “in the weeds” means the same thing “swamped” does in professional-speak: overwhelmed and stressed out. And America’s working class, Guendelsberger argues, is in the weeds all the time, increasingly subjected to an automated neo-Taylorism. Workers are scheduled by algorithm, their tasks timed automatically, and their performance surveilled digitally. This was what she learned on these jobs: “The weeds are a terribly toxic place for human beings. The weeds make us crazy. The weeds make us sick. The weeds destroy family life. The weeds push people into addiction. The weeds will literally kill you.”


What Guendelsberger found in her experiment was that employers now “demand a workforce that can think, talk, feel, and pick stuff up like humans—but with as few needs outside of work as robots. They insist their workers amputate the messy human bits of themselves—family, hunger, thirst, emotions, the need to make rent, sickness, fatigue, boredom, depression, traffic.” The results are “cyborg jobs,” and they account, by Guendelsberger’s reckoning, for almost half of the American workforce. The hidden moments of reclaimed freedom that make any job bearable are being discovered and wiped out by bosses everywhere: That trick you used to use to slow down the machine won’t work anymore; or that window of 23 minutes when you knew your boss couldn’t watch you is vanishing. Whatever little piece of humanity survived in these fragments dies with them.


In her first job, at an Amazon “fulfillment center,” Guendelsberger finds a regime that is Taylor’s “vision incarnate.” (One co-worker, sensing Taylor’s ghost, theorizes that Amazon is “a sociological experiment on how far a corporation can push people.”) Guendelsberger, a “picker,” is made to carry on her waist a scanner gun, which monitors her location, tells her the precise item among the hundreds of thousands in the warehouse that she is to go pluck from the shelves, its location, and how much time she has to do it. A sliding bar counts down as seconds go by, haranguing her. When she’s identified the shelf in the vast facility, dug through the bin, and scanned the item, the next one appears right away. 


While Amazon warehouses—generally in the ruins of economically depressed cities—often offer better wages than whatever else is around, it’s the time-discipline that kills you. The job is extremely monotonous. (To cope, Guendelsberger sews earbuds into her cap in violation of company policy.) When it’s time for breaks, it takes her so long to reach the exit of the massive warehouse that she must almost immediately turn around and go back to work. On top of the stress, it’s physically painful. The company’s time-off policy, she observes, is literally worse than Scrooge’s in A Christmas Carol. Amazon dispenses free painkillers to workers, and Guendelsberger quickly loses track of how many she is taking. At one point, as she squats down to retrieve an item from a low shelf, her body “mutinies,” she writes. “Stand up, I order my legs for the hundredth time today, but it’s as if they’ve gotten fed up with all the abuse and hung up on my brain. Stand up, you idiot, my brain screams as I slowly topple backwards into a sitting position.” Another worker complains, “My feet are, like, mincemeat. I used to walk twenty miles a day with a backpack on and not change my socks, and they never looked as fucked up as they are now.”


The other jobs more or less go this way, too. At the call center, Convergys, Guendelsberger learns she is the human shield between the frustrated customer and the disdainful, predatory company. (And it turns out you can get MRSA at your workstation if you’re not careful.) At this job, the staff are required to try to push sales on callers throughout the interaction, although customers have generally picked up the phone to try to solve a problem with a cable bill. The aggravated callers take it out on the workers, who must multitask among dysfunctional, incompatible computer systems while empathizing and upselling. Guendelsberger begins imagining herself as multiple personalities: Helper Emily, Sales Emily, Protocol Emily, Scribe Emily, Conversation Emily, Short-Term Memory Emily, Awareness Emily, Journalist Emily, and Boss Emily—who has to monitor all the other ones. “Her job sucks.” Her worst call comes from another call-center worker, using her own lunch break as her only opportunity to try to sort out some service problem. 


Call center workers are monitored, disciplined, and reprimanded for time theft if they try to switch the system off between calls. Guendelsberger, admirably widely read and eclectic, introduced the reader to Taylor in the Amazon section of the book; here she provides a brief lesson on Jeremy Bentham’s Panopticon, tossed in with a bit of evolutionary psychology. How would you act if you knew your supervisor might be watching at any time, or that any customer might blow up at you for reasons you can’t control? You’d be on hair-trigger—all of every day. And your body and brain aren’t built for that. Stress response is supposed to be short-term, fight-or-flight. To do it all of every day is to take a soak in an acid bath. (Guendelsberger conveys this by a parable about a backyard-dwelling, rapidly evolving hominid named Wanda; unaccountably, it works.)


The final workplace, a McDonald’s, leaves the least impression—if only because it’s the most familiar. It’s not hard to imagine why serving fast food is terrible work, even setting aside the poverty wages. “There’s always a line,” writes Guendelsberger. “We’re always in the weeds.” As at the call center, she must interact directly with customers, and attempt to fit their demands into the more-or-less preprogrammed pace of production, which she must also keep moving. She gets cut at one point checking on the coffee—you can never let the coffee run out—when the handle breaks and the pot falls on her. Had she not been wearing pants easily removed from her legs, she’d have been burned badly also, since McDonald’s holds its coffee at near-boiling so it will keep longer. “It frequently feels like we’ve been understaffed at the precise levels that will maximize human misery on both sides of the counter.” 


If McDonald’s is like Convergys in that it involves handling people, it differs in that the unruly customers are right there, in person. They can get in her face. An impatient, bossy one (“Hurry up hurry up hurry up”) demands extra honey mustard from her, which technically she’s not supposed to give. (“Honey mustard! Get me honey mustard!”) Guendelsberger breaks the rule to avoid confrontation. But she’s unsteady with anger, and a packet of condiment slips from her hand and over the counter. “Quick as a shortstop, [the customer] scoops it up and wings it at my chest, hard. The packaging explodes; honey mustard splatters all over me and the surrounding area.” The customer, backed up by a friend, accuses Guendelsberger of having thrown the mustard first. Of course—more victim-blaming. It’s the 2010s.


Seen from Guendelsberger’s point of view, America’s working class is quivering in stress and fear, hurting from torn-up feet, and all covered in honey mustard. The economic miseries inflicted on working-class people are bad enough, but here Guendelsberger has identified something deeper and arguably worse: “Chronic stress drains people’s empathy, patience, and tolerance for new things.” We’ve been brutalized, bullied, and baited into being trained work-animals and not even afforded a corresponding pay bump. No wonder our society fell apart.


“Moloch” is the name Steve Fraser gives to this situation in his new essay collection, Mongrel Firebugs and Men of Property: Capitalism and Class Conflict in American History. Citing Milton and Ginsberg, he writes, “The Moloch of capitalism is as deadly and merciless as its Canaanite ancestor. But its altars are everywhere, virtually invisible yet part of the warp and woof of everyday life: at one moment prayed to on Wall Street, at another configuring the most hidden desires and anxieties of everyone’s emotional life.” These prayers, desires, and anxieties—their histories, their infernal dynamics—are the subject of the book’s eleven essays, which touch on virtually the full sweep of American history. Where Guendelsberger, the plucky reporter, came at the problem up close, Fraser—an eminent labor historian—stands back to try to size the whole thing up. Echoing an old-fashioned style of American Studies scholarship, he’s interested in origin myths and in something like a national psyche.


The essays in Mongrel Firebugs summarize and build on two of Fraser’s recent books, The Limousine Liberal and, especially, The Age of Acquiescence—a late-career magnum opus. While the new book’s contents were largely written for magazine readerships over the last ten years and Fraser approaches them in the loose way of a storyteller, they display his encyclopedic knowledge of U.S. history, especially working-class history. (Fraser’s early career was characterized by pathbreaking original scholarship on the labor movement of the early twentieth century, including a masterful biography of garment workers’ leader Sidney Hillman, Labor Will Rule, and the 1989 edited collection The Rise and Fall of the New Deal Order, which continues to set historians’ agendas today.) 


In his recent work, and especially in the essays collected here, Fraser traces a distinct arc across the history of American capitalism. The nineteenth century was the age of capital’s ravenous growth, consuming all in its path. “It proceeded relentlessly,” he writes, “appropriating land and resources both human and natural that had once been off limits because they were enmeshed in alternative forms of slave, petty, and subsistence economies.” In the face of this social apocalypse, people resisted vigorously, turning the late nineteenth and early twentieth centuries into a period of protracted and often violent social conflict, which he calls a “Second Civil War.” “The legions of the displaced became charter members of an American proletariat. Their new existence was both a promise and a reproach,” writes Fraser.


This is the age of the mass strike and the general strike, events that escaped the confines of any particular employer-employee relationship and became instead the cri de coeur of a whole new world, as for example in the national crusade for the eight-hour day.



It’s likely that Henry “Schmidt” Noll saw some of this action himself: Bethlehem Steel had fierce strikes in 1910, 1918, and 1919.


Decades of such struggles culminated in the New Deal. Workers at Bethlehem Steel, for example, struck again in 1937 and 1941—alongside millions of others around the country in these years. Finally, they won recognition, and they were quickly co-opted into the American mainstream. The conservative compromises that initially stabilized this new order—reinstitutionalized racial and gender hierarchies, coercive deradicalization of labor, private administration of the incomplete welfare state—also left it riddled with contradictions, ultimately producing its decay into neoliberalism in the 1970s.


Here Fraser arrives at his new great subject, the psychic economy of our time. Where the first “Gilded Age” saw enormous resistance to inequality, Fraser argues, ours has seen a distracted, demoralized culture of compliance. Economic risk-taking, positively stigmatized after the Great Depression, is now spoken of in heroic terms: To the risk-taker go the spoils. (Google “risk-taker” and try not to shudder at what you see.) De facto debt servitude and penal labor are back, too, although neither is met with the outrage one might have expected based on earlier historical experience. (Coal miners in Tennessee took up arms in the 1890s to free convict laborers, grasping what the practice portended for themselves.) Unemployment, understood through the late nineteenth century as a grotesque and unacceptable social phenomenon and resisted in spectacular episodes of collective action, is now accepted as natural—cyclical, like the seasons.


Painfully, the most potent strand of resistance instead has been the right-wing populist outrage of the petit bourgeois against the “limousine liberal.” In the book’s later entries, Fraser explores this American demagogic tradition, finding Donald Trump’s clearest predecessor in William Randolph Hearst. Though here, too, he notes, irresponsible populism a century ago required a pro-labor posture. “Today’s right-wing populists are hardly about to invoke the anti-capitalism that impassioned the people Hearst counted on. On the contrary, what draws them to The Donald is that he is an übermensch risen atop the capitalist order.” Trump in this way only exemplifies the phenomenon of the ascent of the family capitalists—like the Kochs, Waltons, and so on—in whose hands enormous wealth has accumulated in recent years, and who, liberal and reactionary alike, manifest “godlike desire to create the world in their image.” The worship they receive, at its apex in Trump’s presence in the White House, suggests that their apotheosis has been successful—“the genie grown monstrous,” as Fraser puts it.


For Fraser, the cause of this deep ideological transformation lies in the altered “metabolism” of capitalism. Where once it produced upheaval by swallowing everything it could chew, today its systems are basically expulsive: unemployment and exclusion, rather than coerced assimilation and employment. “The gears of Progress, that demiurge of the first Gilded Age, were set in reverse,” Fraser writes. Capitalism “autocannibalized” itself, and the spirit of the new age was accordingly the dejection of the social reject, not the outrage of the unwilling conscript.

 

Indeed, Fraser can’t help but telegraph his own dejection. “There is abroad in the world the spirit of Moloch,” he concludes, “luridly lighting up the abyss out of which Trump has emerged.” While his last lines call for renewed dreams of emancipation, he hasn’t devoted much space to searching out where such dreams might come from, and doesn’t seem to have much faith that they’ll materialize. Here the gap between Fraser’s defeated New Left generation and Guendelsberger’s defiant Millennials looms large.


The generational difference is political, but it’s also sociological. Guendelsberger, unlike her direct predecessor Ehrenreich, isn’t exactly slumming it—she doesn’t have as far to fall. Through much of Nickel and Dimed, Ehrenreich is tormented by the ethical implications of the social distance between herself and her co-workers. Guendelsberger, on the other hand, is fairly unbothered on this count; she was already unemployed when she embarked on her project. She sleeps in her car for significant portions of the narrative, and accepts the charity of her workmates gratefully. She began the book, in fact, on spec—she only got her contract during her second stint, at Convergys. “Even if nothing came of it, I figured, I would at least bank a couple thousand bucks.” 


The contrast with the conception of Nickel and Dimed—brainstormed over salmon with Lewis Lapham—is a perfect index of what the last 20 years have done to the once-secure professional strata. Ehrenreich set out to rediscover the lost land of the working class as a self-conscious representative of the complacent middle class, in order to send word back and stimulate the numbed yuppie conscience. After another generation of neoliberalism, the line between these two groups has blurred, so this interpreter act seems less urgent. Guendelsberger herself straddles the line, and she imagines her reader does, too. “Yeah, you, mamá,” she writes in her afterword, hailing the reader the way workers addressed each other at McDonald’s. “You’re a worker too—just like me and Jess and Zeb and Candela and Kolbi and Miguel and the Mustard Lady.”


To be sure, the places Guendelsberger went to work are saturated with the poisonous ideologies Fraser explores. The Convergys staff are continuously surveilled for “time theft” while the employer steals time from workers left and right. The “Amazonians” are told repeatedly that they’re making history, and many seem to believe it. Complaining co-workers are often dismissed as ingrates. (“if You think Amazon is bad, try McDonalds you McBitches,” an online commenter scolds.) One warehouse workmate, “Blair,” both frets constantly about following the rules and aspires to beat the world record for fastest picker. She hopes in this way to prove that humans will always beat robots. As Guendelsberger observes, Blair resembles John Henry, the mighty, tall-tale figure who raced against the new steam drill, blasting through mountainside with only his hammer—winning, but dying with his hammer in his hand.


The Steel-Driving Man—likely a black convict laborer, and a slight physical figure in reality—was memorialized in what became one of the most popular American folk songs from the turn of the twentieth century through the Great Depression. Around the country, laborers kept pace with their machines, intoning, “I’ll die with this hammer in my hand.” John Henry, threatened and ultimately killed by the machine, yet still triumphant, became one of the most potent symbols of workers’ explosive resistance to primitive accumulation. His legend, as the historian Scott Reynolds Nelson shows in his extraordinary book Steel Drivin’ Man, resonated across sectors of the new proletariat that shared nothing but a common hostility to the new order.


On the other hand, you may bet safely that Uber drivers, adjunct professors, and home health aides will not pass away their own toilsome hours by singing songs about Blair’s race with the algorithm. Blair is doing what John Henry did, but the act’s meaning is inverted: It signifies the power of the boss’s ideology, not its rejection. She’s the perfect example of Fraser’s argument. 


Because Guendelsberger is herself a precarious journalism worker, she has little trouble discovering and slipping into the currents of solidarity that flow under the surface in almost all workplaces. Labor in capitalism is nearly always, in some way, social. Subdivided over and over by Taylor and those who came before and after him, capitalist production requires that people work together. No matter how hard management tries to keep them from getting to know and trust each other, they always will, at least a little. “We’re all in this together against the stopwatches and the sharks,” writes Guendelsberger. (She deploys an extended shark metaphor at one point.) “And we may be only human, but there’s a whole lot of us.” It is this social aspect of labor that is the key to unlock the ideological prison that Fraser describes. Guendelsberger concludes the book with a prediction: “You’ll meet other people who think the status quo is cruel and ridiculous—they’re literally everywhere.… You’ll come to feel a bond with them that’s stronger than friendship. You’ll become part of something bigger than yourself—and weirdly, you’ll feel more in control of your life than you have in years.”


Guendelsberger worked at Amazon during early winter. She writes of the stress of the holiday season as a horrible speedup, a kind of waking nightmare: She can’t control her tormented body, she’s bored, stressed, and depressed all at once. But, it turns out, this isn’t the only way to experience the busy season. With a week to go until Christmas, she finds her way to a tent village where a group of temporary workers are staying. They have mini-pizzas, and she brings beer and some cookies. They tell Guendelsberger she’s got it all wrong—she’s been working much too hard. You only need to make rate if you’re trying to get promoted and stick around for a long time. Explains one named Matthias, “They need us there more than they’re paying us.” Testing the limits, he managed to take 48 extra minutes off before lunch recently before they came and talked to him. He points out, “‘The facility as a whole was already operating at 110 percent—at that point, what the hell does it actually matter?’” Matthias says, “affecting a cheery, brainwashed tone, ‘We’re Making History! Exceeding Expectations!’”


A group of transient temps taking long breaks and mocking Jeff Bezos around a campfire isn’t a revolution, but it’s not nothing either. As Guendelsberger says, some version of this is, necessarily, everywhere. On your own, it’s hard to know whether you really do need to make rate, or what to do when someone throws mustard at you and they say you started it. It’s easy to crack under this pressure: Moloch is powerful and frightening. But the thing about false gods is that they truly cannot abide being mocked, and there’s always someone else who sees through it, too—more, in fact, every day. The boss may have an all-seeing panopticon, but the prehistory of every strike begins when one worker catches another’s eye.

ENJOY!

Tuesday, December 17, 2019

HP

This is going to be a series of posts, swiped from Wikipedia and various other places, about HP.
HP was, in a way, the epitome of American corporate culture. It was founded in 1939 by Hewlett and Packard, and it was the seedbed of many things, from the first serious audio oscillator to the inkjet printer, along with a million other things in between. But let's turn to the founders:
Note: As an electronics engineer I have a special affection for the founders of Silicon Valley and Terman's disciples. Somewhere a discussion came up about which scientist contributed the most in the Second World War; some said Turing, nobody remembered Vannevar Bush. That will be another post.



Friday, June 7, 2019

Nostalgia

Sometimes something personal: I heard a song and it made me nostalgic for Mardel. It's a classic; it couldn't have been anything else.


Wednesday, May 22, 2019

Central Korea

Sometimes Moreno is fun. I may or may not agree with him, but there are times, when he stands up to the star system and to manufactured prejudices, when you can't help but agree. "Coreacentrismo" and total political correctness are not his thing, and maybe that's fine at this stage.


Monday, May 20, 2019

Vaca Muerta and the speed of monetization


A brief collection of thoughts about it, or trigger titles for future posts (is there one, the Sex Pistols would ask?)




1. Let's monetize fast, the future belongs to renewables!!! Fallacy: gas is not only useful as fuel.

2. Let's monetize fast, and let's pay down debt or live better! Fallacy: in the end only direct taxes and the provinces' royalties will remain; between payments for equipment and technology all the money will leave, and, if you're not careful, not even income tax will get collected (today's example: the statistics fee cut to zero for equipment headed to Vaca Muerta).

3. Let's monetize fast, the resource is there and we stop importing!!! Fallacy: gas today is a tradable good; the problem is how much you pay for imports, and the mistake is thinking the local product must be cheaper. You always have to pay.

4. Let's monetize fast, YPF and the others are very efficient today!!! Fallacy: more than half of YPF's efficiency is due to the surface facilities already being there from before; they only had to add horizontal drilling instead of vertical (and not just YPF, mind you).

5. Let's monetize fast, and let's use the money for technology!!! If you ask me for money, how soon do you pay it back? As of today, it's all hot air.

Oh well.

Friday, May 17, 2019

Pensions, Retirements, and Cuarón

Yesterday, reading www.elgatoylacaja.com, I landed on a kind of panegyric to housewives' retirements and pensions, etc., etc., and I was perhaps harsh (whatever) in saying a few things.

Besides which, as someone said, Cuarón was only trying to exorcise his own demons, and people are transplanting realities that aren't local: the white vs. mestizo divide in the central region was never that rigid.

To clarify: there was always a system for people not in salaried employment to contribute, it was called autónomos; and there was always a non-contributory system for those over 70, they called it a Pensión.

The only new thing was allowing, through a kind of moratorium (and I say "a kind of" because what was paid back was not indexed), a certain gender, at a certain age, and nothing more, to retire. That may be fair in some cases; in others?
And those who religiously paid as autónomos (a regime which until '93 didn't have many requirements, and which later had the monotributo) were swindled. Those who didn't pay "didn't know, didn't want to, or couldn't," as some great figure of our democracy put it; but those who sacrificed things to pay their retirement contributions, and who in any case retired as autónomos at 65, did it for nothing (and I'm being polite with the term).

Which shows that in this country the state always comes to everyone's rescue, and "everyone" includes both those who need it and those who don't. The next chapter will be the UVA loans, which, like those who survived the 1050, those who had their debts pesified in 2002, and those who always got loans from the BHN, will in the end be subsidized by the population as a whole, while those who really need help get birdseed.

Oh well.

Wednesday, May 15, 2019

Nerd, so?

It's long, very long.


Nerds, we did it. We have graduated, along with oil, real estate, insurance, and finance, to the big T. Trillions of dollars. Trillions! Get to that number any way you like: Sum up the market cap of the major tech companies, or just take Apple’s valuation on a good day. Measure the number of dollars pumped into the economy by digital productivity, whatever that is. Imagine the possible future earnings of Amazon.
The things we loved—the Commodore Amigas and AOL chat rooms, the Pac-Man machines and Tamagotchis, the Lisp machines and RFCs, the Ace paperback copies of Neuromancer in the pockets of our dusty jeans—these very specific things have come together into a postindustrial Voltron that keeps eating the world. We accelerated progress itself, at least the capitalist and dystopian parts. Sometimes I’m proud, although just as often I’m ashamed. I am proudshamed.
And yet I still love the big T, by which I mean either “technology” or “trillions of dollars.” Why wouldn’t I? I came to New York City at the age of 21, in the era of Java programming, when Yahoo! still deserved its exclamation point. I’d spent my childhood expecting nuclear holocaust and suddenly came out of college with a knowledge of HTML and deep beliefs about hypertext, copies of WIRED (hello) and Ray Gun bought at the near-campus Uni-Mart. The 1996 theme at Davos was “Sustaining Globalization”; the 1997 theme was “Building the Network Society.” One just naturally follows the other. I surfed the most violent tsunami of capital growth in the history of humankind. And what a good boy am I!
My deep and abiding love of software in all its forms has sent me—me—a humble suburban Pennsylvania son of a hardscrabble creative writing professor and a puppeteer, around the world. I lived in a mansion in Israel, where we tried to make artificial intelligence real (it didn’t work out), and I visited the Roosevelt Room of the White House to talk about digital strategy. I’ve keynoted conferences and camped in the backyard of O’Reilly & Associates, rising as the sun dappled through my tent and emerging into a field of nerds. I’ve been on TV in the morning, where the makeup people, who cannot have easy lives, spackled my fleshy Irish American face with pancake foundation and futilely sought to smash down the antennae-like bristle of my hair, until finally saying in despair, “I don’t know what else to do?” to which I say, “I understand.”
When I was a boy, if you’d come up behind me (in a nonthreatening way) and whispered that I could have a few thousand Cray supercomputers in my pocket, that everyone would have them, that we would carry the sum of human ingenuity next to our skin, jangling in concert with our coins, wallets, and keys? And that this Lilliputian mainframe would have eyes to see, a sense of touch, a voice to speak, a keen sense of direction, and an urgent desire to count my actual footsteps and everything I read and said as I traipsed through the noosphere? Well, I would have just burst, burst. I would have stood up and given the technobarbaric yawp of a child whose voice has yet to change. Who wants jet packs when you can have 256 friggabytes (because in 2019 we measure things in friggin’ gigabytes) resting upon your mind and body at all times? Billions of transistors, attached to green plastic, soldered by robots into a microscopic Kowloon Walled City of absolute technology that we call a phone, even though it is to the rotary phone as humans are to amoebas. It falls out of my hand at night as I drift to sleep, and when I wake up it is nestled into my back, alarm vibrating, small and warm like a twitching baby possum.
I still love software. It partially raised me and is such a patient teacher. Being tall, white, enthusiastic, and good at computers, I’ve ended up the CEO of a software services company, working for various large enterprises to build their digital dreams—which you’d figure would be like being a kid in a candy store for me, sculpting software experiences all day until they ship to the web or into app stores. Except it’s more like being the owner of a candy factory, concerned about the rise in cost of Yellow 5 food coloring and the lack of qualified operators for the gumball-forming machine. And of course I rarely get to build software anymore.
I would like to. Something about the interior life of a computer remains infinitely interesting to me; it’s not romantic, but it is a romance. You flip a bunch of microscopic switches really fast and culture pours out.
A few times a year I find myself walking past 195 Broadway, a New York City skyscraper that has great Roman columns inside. It was once the offices of the AT&T corporation. The fingernail-sized processor in my phone is a direct descendant of the transistor, which was invented in AT&T’s Bell Labs (out in New Jersey). I pat my pocket and think, “That’s where you come from, little friend!” When the building was constructed, the company planned to put in a golden sculpture of a winged god holding forked lightning, called Genius of Telegraphy.
But by the time the building was finished AT&T had sold off the telegraph division, so the company called it Spirit of Electricity. But that must have been too specific, because it was renamed Spirit of Communication. And then in 1984, the Bell system, after decades of argument about its monopoly status, broke up (with itself and with America).
Now the New York offices are rented out to, among other things, a wedding planning website and a few media companies. The statue has been relocated to Dallas. Today everyone calls it Golden Boy.

In the late 1990s I was terrified of mailing lists. For years the best way to learn a piece of software—especially some undocumented, open sourced thing you had to use to make websites—was to join its community and subscribe to its mailing lists, tracking the bugs and new releases. Everything was a work in progress. Books couldn’t help you. There was no GitHub or Stack Overflow.
I could only bring myself to lurk, never to contribute. I couldn’t even ask questions. I was a web person, and web people weren’t real programmers. If I piped up, I was convinced they’d yell, “Get off this mailing list! You have no place in the community of libxml2! Naïf!” The very few times I submitted bugs or asked questions were horrible exercises in rewriting and fear. Finally I’d hit Send and—
Silence, often. No reply at all. I’d feel awful, and a little outraged at being ignored. I was trying so hard! I’d read the FAQs!
Eventually I met some of those magical programmers. I’d sneak into conferences. (Just tell the people at the entry you left your badge in the hotel room.) They were a bunch of very normal technologists contributing, through their goodwill and with their spare time, to open source software tools.
“I use your code every day,” I’d say. They were pleased to be recognized. Surprised at my excitement. They weren’t godlike at all. They were, in many ways, the opposite of godlike. But I am still a little afraid to file bug reports, even at my own company. I know I’m going to be judged.
So much about building software—more than anyone wants to admit—is etiquette. Long before someone tweeted “That’s not OK!” there were netiquette guides and rule books, glossaries, and jargon guides, like The New Hacker’s Dictionary, available in text-only format for download, or Hitchhiker’s Guide to the Internet, first released in 1987. Bibles. There were the FAQs that would aid newcomers to the global decentralized discussion board Usenet. FAQs kept people from rehashing the same conversation. When college freshmen logged on in September—because that’s where the internet happened back in the 1980s and ’90s, at colleges and a few corporations—they would be gently shown the FAQs and told how to behave. But then in 1993, AOL gave its users Usenet access—and that became known as the Eternal September. The ivory tower was overrun. That was the day the real internet ended, 26 years ago. It was already over when I got here.
The rulemaking will never end. It’s rules all the way down. Coders care passionately about the position of their brackets and semicolons. User experience designers work to make things elegant and simple and accessible to all. They meet at conferences, on message boards, and today in private Slacks to hash out what is good and what is bad, which also means who is in, who is out.
I keep meeting people out in the world who want to get into this industry. Some have even gone to coding boot camp. They did all the exercises. They tell me about their React apps and their Rails APIs and their page design skills. They’ve spent their money and time to gain access to the global economy in short order, and often it hasn’t worked.
I offer my card, promise to answer their emails. It is my responsibility. We need to get more people into this industry.
But I also see them asking, with their eyes, “Why not me?”
And here I squirm and twist. Because—because we have judged you and found you wanting. Because you do not speak with a confident cadence, because you cannot show us how to balance a binary tree on a whiteboard, because you overlabored the difference between UI and UX, because you do not light up in the way that we light up when hearing about some obscure bug, some bad button, the latest bit of outrageousness on Hacker News. Because the things you learned are already, six months later, not exactly what we need. Because the industry is still overlorded by people like me, who were lucky enough to have learned the etiquette early, to even know there was an etiquette.
I try to do better, and so does my company. How do you change an industry that will not stop, not even to catch its breath? We have no leaders, no elections. We never expected to take over the world! It was just a scene. You know how U2 was a little band in Ireland with some good albums, and over time grew into this huge, world-spanning band-as-brand with stadium shows with giant robotic structures, and Bono was hanging out with Paul Wolfowitz? Tech is like that, but it just kept going. Imagine if you were really into the group Swervedriver in the mid-’90s but by 2019 someone was on CNBC telling you that Swervedriver represented, I don’t know, 10 percent of global economic growth, outpacing returns in oil and lumber. That’s the tech industry.

No one loves tech for tech’s sake. All of this was about power—power over the way stories were told, the ability to say things on my own terms. The aesthetic of technology is an aesthetic of power—CPU speed, sure, but what do you think we’re talking about when we talk about “design”? That’s just a proxy for power; design is about control, about presenting the menu to others and saying, “These are the options you wanted. I’m sorry if you wanted a roast beef sandwich, but sir, this is not an Arby’s.” That is Apple’s secret: It commoditizes the power of a computer and sells it to you as design.
Technology is a whole world that looks nothing like the world it seeks to command. A white world, a male world, and—it breaks my heart to say it, for I’ve been to a lot of Meetups (now a WeWork company), and hosted some too—a lonely world. Maybe I’m just projecting some teenage metaphysics onto a lively and dynamic system, but I can’t fully back away from that sense of monolithic loneliness. We’re like a carpenter who spent so long perfecting his tools that he forgot to build the church.
But not always. One night in October 2014, I had a few drinks and set up a single Linux server in the cloud and called it tilde.club, then tweeted out that I’d give anyone an account who wanted one. I was supposed to be working on something else, of course.
Suddenly my email was full: Thousands of people were asking for logins. People of all kinds. So I made them accounts and watched in awe as they logged on to that server. You can put hundreds of people on one cheap cloud computer. It’s just plain text characters on a screen, like in the days of DOS, but it works. And they can use that to make hundreds of web pages, some beautiful, some dumb, exactly the way we made web pages in 1996. Hardly anyone knew what they were doing, but explaining how things worked was fun.
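(For the curious, a minimal sketch of what that looks like under the hood: on a Linux box, accounts are just entries made by the standard useradd tool, and the old “~user” convention means a web server can publish whatever lands in each person’s public_html folder. The Python below is an illustration with made-up usernames, not the actual tilde.club script, and it would need root to run.)

import os
import subprocess

# Hypothetical list of requested usernames; the real intake was a flood of email.
requested = ["ada", "grace", "dennis"]

for name in requested:
    # Create a login with a home directory (the standard Linux useradd tool).
    subprocess.run(["useradd", "--create-home", name], check=True)
    # public_html is the classic "~user" folder a web server can be pointed at.
    os.makedirs(f"/home/{name}/public_html", exist_ok=True)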
For a few weeks, it was pure frolic. People made so many web pages, formed committees, collaborated. Someone asked if I’d sell it. People made their own tilde servers. It became a thing, but an inclusive thing. Everyone was learning a little about the web. Some were teaching. It moved so fast I couldn’t keep up. And in the end, of course, people went back whence they came—Twitter, Facebook, and their jobs. We’d had a very good party.
The server is still up. Amazon sends a bill. I wish the party could have kept going.
But briefly I had made a tiny pirate kingdom, run at a small loss, where people were kind. It was the opposite of loneliness. And that is what I wish for the whole industry. Eternal September is not to be hated, but accepted as the natural order of success. We should invite everyone in. We should say, We’re all new here.

“Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind.” This was John Perry Barlow’s “A Declaration of the Independence of Cyberspace,” a document many people took seriously, although I always found it a little much. Barlow was a prophet of network communication, an avatar of this magazine. “On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.” It’s signed from Davos, 1996 (the year of “Sustaining Globalization”).
Exposure to the internet did not make us into a nation of yeoman mind-farmers (unless you count Minecraft). That people in the billions would self-assemble, and that these assemblies could operate in their own best interests, was … optimistic.
But maybe! Maybe it could work. There was the Arab Spring, starting in 2010. Twitter and Facebook were suddenly enabling protest, supporting democracy, changing the world for the better. This was the thing we’d been waiting for—
And then it wasn’t. Autocracy kept rearing its many heads, and people started getting killed. By 2014, Recep Tayyip Erdoğan was shutting off Twitter in Turkey to quell protests, and then it came home, first as Gamergate, wherein an online campaign of sexual harassment against women, somewhat related to videogames, metastasized into an army of enraged bots and threats. And as Gamergate went, so went the 2016 election. It was into this gloomy context that I made tilde.club that night—a blip of nostalgia and cheer fueled by a few Manhattans.
People—smart, kind, thoughtful people—thought that comment boards and open discussion would heal us, would make sexism and racism negligible and tear down walls of class. We were certain that more communication would make everything better. Arrogantly, we ignored history and learned a lesson that has been in the curriculum since the Tower of Babel, or rather, we made everyone else learn it. We thought we were amplifying individuals in all their wonder and forgot about the cruelty, or at least assumed that good product design could wash that away. We were so hopeful, and we shaved the sides of our heads, and we never expected to take over the world.
I’m watching the ideologies of our industry collapse. Our celebration of disruption of every other industry, our belief that digital platforms must always uphold free speech no matter how vile. Our transhumanist tendencies, that sci-fi faith in the singularity. Our general belief that software will eat the world and that the world is better for being eaten.
It’s been hard to accept, at least for me, that our techy ideologies, each with its merits, don’t really add up to a worldview, because technology is not the world. It’s just another layer in the Big Crappy Human System along with religion, energy, government, sex, and, more than anything else, money.
I don’t know if I can point to any one thing and say “that’s tech” in 2019. (Well, maybe GPU programming for 3D graphics. That’s nerd central.) The cost of our success is that we are no longer unique. The secret club is no longer a gathering of misfits. We are the world. (We are the servers. We are the ones who gather faves and likes, so let’s start clicking. Sorry.)
I’ve made a mistake, a lifelong one, correlating advancements in technology with progress. Progress is the opening of doors and the leveling of opportunity, the augmentation of the whole human species and the protection of other species besides. Progress is cheerfully facing the truth, whether flooding coastlines or falling teen pregnancy rates, and thinking of ways to preserve the processes that work and mitigate the risks. Progress is seeing calmly, accepting, and thinking of others.
It’s not that technology doesn’t matter here. It does. We can enable humans to achieve progress. We make tools that humans use. But it might not be our place to lead.
I wish I could take my fellow CEOs by the hand (they’re not into having their hands held) and show them Twitter, Facebook, Tumblr, and any of the other places where people are angry. Listen, I’d say, you’re safe. No one is coming for your lake house, even if they tweet “I’m coming for your lake house.” These random angry people are merely asking us to keep our promises. We told them 20-some years ago that we’d try to abolish government and bring a world of plenty. We told them we’d make them powerful, that we’d open gates of knowledge and opportunity. We said, “We take your privacy and security seriously at Facebook.” We said we were listening. So listen! They are submitting a specification for a world in which fairness is a true currency, and then they’re trying to hold everyone to the spec (which is, very often, the law). As someone who spent a lot of time validating XML and HTML pages, I empathize. If bitcoin can be real money, then fairness can be a real goal.
We might have been them, if we’d been born later and read some different websites. And it’s only a matter of time before they become us.
EVERY MORNING I drop off my 7-year-old twins, a boy and a girl, at their public school, and they enter a building that was established a century ago and still functions well for the transmission of learning, a building filled with digital whiteboards but also old-fashioned chalkboards and good, worn books.
I think often of the things the building has seen. It was built in an age of penmanship and copybooks, shelves of hardbound books and Dick and Jane readers; it made its way through blue mimeographs with their gasoline smell. Milkmen delivered with horses when it was built, and now every parking space is filled with Toyotas and school buses. Teachers and principals come young and retire decades later. There are certain places where craft supplies are stored. The oldest living student just turned 100 years old, and some students walked to his home and sang him “Happy Birthday.” They announced it at the multicultural music event.
The school hasn’t moved in a century, but it is a white-hot place in time. Ten or twenty thousand little bodies have come through here on their way to what came next. While they are here, it’s their whole world. It feeds the children who need to be fed.
I watch my kids go through the front doors. (I call this my “cognitive receipt,” because unless I see them I worry that I somehow forgot to drop them off.) Then I walk to the bus stop. The bus comes, and off we go, across an elevated highway and through a tunnel. Then we take the FDR Drive and pass right under three bridges: the Brooklyn, the Manhattan, the Williamsburg. Each bridge has its own story, an artifact of its time, a product of various forms of hope, necessity, and civic corruption, each one an essay on the nature of gravity and the tensile strength of wire. Everyone on the bus looks at their phone or looks out the window, or sometimes they read a book.
Sometimes I think of the men who died making the Brooklyn Bridge; sometimes I play a game on my phone. This is as close as it gets to the sacred for me, to be on a public conveyance, in the arms of a transit authority, part of a system, to know that the infrastructure has been designed for my safety. In the winter, I can look down into the icy East River and fantasize about what it would take to push us into the river, because only a small, low concrete barrier keeps us from death. I think of how I’d escape and how I’d help others up. But the bus never hurtles into the water. They made sure of it.

I know that my privacy is being interfered with, that I’m being watched, monitored, tracked by giant companies, and that I’m on video. (I wish I’d known how often I’d be on video in 2019, how often I’d need to see my own animated face in the corner of the video call.) I know also that I have been anticipated by the mineralogists who study asphalt and that I am surrounded by tolerances and angles, simple and complex machines.
My children are safe in an old, too-warm building that has seen every system of belief and every kind of education, one that could easily last another 100 years, with glowing lichen on the wall in place of lights. Imagine how many light-emitting sneakers they’ll have by then.
Maybe I should have moved to the Bay Area to be closer to this industry I love, and just let myself fall backward into tech. I could never muster it, even though I studied maps of San Francisco and pushed my wife to come with me and visit the corporate campuses of Apple, Google, and the like, which meant visiting a lot of parking lots.
But I didn’t move. I stayed in New York, where on a recent Saturday I went to the library with my kids. It’s a little one-story library, right next to their school, and it’s as much a community center as repository of knowledge. I like quiet, so sometimes I get annoyed at all the computers and kids, the snacking moms and dads. But it’s 2019 and I live in a neighborhood where people need public libraries, and I live in a society.
When we visited one day in February, there was a man in a vest behind me setting up some devices with wire and speakers. He was trying to connect two little boxes to the devices and also to two screens, and calling gently to a passing librarian for a spare HDMI cable. Kids were coming up and looking. They were particularly interested in the cupcakes he’d brought with him.
“We’re having a birthday party,” he said, “for a little computer.”

By which he meant the Raspberry Pi. Originally designed in the UK, it’s smaller than a can of soda and runs Linux. It costs $35. It came into the world in February 2012, sold as a green circuit board filled with electronics, with no case, nothing, and became almost instantly popular. In that and subsequent versions, 25 million units have been sold. A new one is much faster but basically the same size, and still costs $35.
But for the terrible shyness that overcame me, I would have turned around right there and grasped that man’s hand. “Sir,” I would like to have said, “thank you for honoring this wonderful device.”
You get your Raspberry Pi and hook it up to a monitor and a keyboard and a mouse, then you log on to it and … it’s just a Linux system, like the tilde.club machine, and ready for work. A new computer is the blankest of canvases. You can fill it with files. You can make it into a web server. You can send and receive email, design a building, draw a picture, write 1,000 novels. You could have hundreds of users or one. It used to cost tens of thousands of dollars, and now it costs as much as a fancy bottle of wine.
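(To give a sense of how blank that canvas is: the few lines of Python below, using the standard library’s http.server module, are enough to publish a folder of hand-made pages from the Pi, or from any Linux machine. A generic sketch, nothing specific to the Raspberry Pi.)

from http.server import HTTPServer, SimpleHTTPRequestHandler

# Serve whatever files live in the current directory on port 8000.
# Point a browser at the machine's address, port 8000, to see them.
server = HTTPServer(("0.0.0.0", 8000), SimpleHTTPRequestHandler)
server.serve_forever()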
I should have said hello to the man in the library. I should have asked my questions on the mailing lists. I should have engaged where I could, when I had the chance. I should have written fan letters to the people at Stanford Research Institute and Xerox PARC who bootstrapped the world I live inside. But what do you say? Thank you for creating a new universe? Sorry we let you down?
We are all children of Moore’s law. Everyone living has spent the majority of their existence in the shadow of automated computation. It has been a story of joy, of mostly men in California and Seattle inventing a future under the occasional influence of LSD, soldering and hot-tubbing, and underneath it all an extraordinary glut of the most important raw material imaginable—processor cycles, the result of a perfect natural order in which the transistors on the chips kept doubling, speeds in the kilo-, mega-, and eventually gigahertz, as if the camera had zoomed in on an old IBM industrial wall clock that sped up until its minute hand was a blur, and then the hour hand, and then the clock caught fire and melted to the ground, at which point money started shooting out of the hole in the wall.
There is probably no remaining growth like what we’ve seen. Attempts to force a revolution don’t seem to work. Blockchain has yet to pan out. Quantum computing is a long and uncertain road. Apple, Google, and their peers are poised to get the greatest share of future growth. Meanwhile, Moore’s law is coming to its natural conclusion.
I have no desire to retreat to the woods and hear the bark of the fox. I like selling, hustling, and making new digital things. I like ordering hard drives in the mail. But I also increasingly enjoy the regular old networks: school, PTA, the neighbors who gave us their kids’ old bikes. The bikes represent a global supply chain; when I touch them, I can feel the hum of enterprise resource planning software, millions of lines of logistics code executed on a global scale, bringing the handlebars together with the brakes and the saddle onto its post. Then two kids ride in circles in the supermarket parking lot, yawping in delight. I have no desire to disrupt these platforms. I owe my neighbors a nice bottle of wine for the bikes. My children don’t seem to love computers as I do, and I doubt they will in the same way, because computers are everywhere, and nearly free. They will ride on different waves. Software has eaten the world, and yet the world remains.

We’re not done. There are many birthdays to come for the Raspberry Pi. I’m at the office on a Sunday as I write this. My monitor is the only light, and if you could see me I’d be blue.
I’m not sure if I should be a CEO forever. I miss making things. I miss coding. I liked having power over machines. But power over humans is often awkward and sometimes painful to wield. I wish we’d built a better industry.
I was exceptionally lucky to be born into this moment. I got to see what happened, to live as a child of acceleration. The mysteries of software caught my eye when I was a boy, and I still see it with the same wonder, even though I’m now an adult. Proudshamed, yes, but I still love it, the mess of it, the code and toolkits, down to the pixels and the processors, and up to the buses and bridges. I love the whole made world. But I can’t deny that the miracle is over, and that there is an unbelievable amount of work left for us to do.