• Building Blocks of a Digital Caste Panopticon: Everyday Brahminical Policing in India
    https://logicmag.io/policy/building-blocks-of-a-digital-caste-panopticon-everyday-brahminical-policing

    A trip to India today is a bit like a trip to Berlin in 1936.

    13.12.2023 by Nikita Sonavane, Mrinalini R, Aditya Rawat, Ramani Mohanakrishnan and Vikas Yadav

    In August 2022, in the southern Indian state of Telangana, an American dream came to fruition: the completion of a sprawling, state-of-the-art headquarters for policing and surveillance. Located in the heart of Hyderabad, the state’s capital, the structure—christened the Integrated Control and Command Centre—is proudly described by the Telangana Police Department as the Indian equivalent of the New York City Police Department’s 1 Police Plaza. Composed of five enormous twenty-story glass buildings, this “world-class construction” is the first of its kind in India—a prestige value not lost on the ruling political party in the state, which has wasted no time mobilizing it as a public symbol of the long struggle for an independent state of Telangana.

    The Command Centre’s origins date to soon after the state’s formation in 2014, when Chief Minister K. Chandrasekhar Rao introduced significant changes to Telangana’s policing system to attract foreign investors—a vision shared by his predecessor, Chandrababu Naidu, who had received World Bank backing to develop Hyderabad into India’s next “Silicon Valley.” Policy prescriptions sought to position the state as a center of informational growth, making Hyderabad a neoliberal urban lab with policing as the central instrument for creating “safe” spaces for foreign investment. This logic was seen as complementary to “securing” the increasing number of women in public spaces. A slew of laws concerning policing were introduced in the name of “women’s safety.” In September 2014, for instance, $203.8 million was earmarked for the overhaul of the Telangana police, of which Hyderabad’s city department was allocated $82.5 million. There are now 925,000 closed-circuit TV cameras across Telangana, including CCTVs installed by private entities and communities. Of these, nearly 370,000 are in Hyderabad alone.

    In preparation for the overhaul of the government surveillance apparatus, Telangana also conducted an unprecedented one-day socioeconomic survey, the largest in India, called the Kutumba Samagra Survey. It sought to enumerate ninety-four parameters of civic life by recording details of individuals and their family members, disabilities and chronic diseases, housing, and movable assets. Launched by the state planning commission, this was part of the 360-degree profile that the state government was building on its citizens, known as the Integrated People Information Hub. While similar population databases are now cropping up across the Indian states, Telangana’s survey is an anomaly: not only did the police department play a crucial part in compiling the data, but it also hosted the data for two years before transferring it to the IT department. As one South India–based tech researcher explained to us, “The surveys are often conducted by the Rural Development department, Planning or some special initiatives. While the data sits with this e-Governance entity under the IT department, it’s an informal arrangement through which the Police have access.”

    The state’s focus on safety and development has largely been predicated on the “visibility of policing,” which is characterized by a host of digital tools geared toward making the police more “efficient” as well as more omnipresent. Hyderabad has served as the model for such “smart policing,” particularly since the rise of Hindu nationalism in the country and its criminalization of Muslims. Much like its idolized counterpart, the New York City Police Department, the Hyderabad police has used technological prowess to intensify its relentless gaze upon the lives and bodies of oppressed communities in the city. Casteist digital datafication of marginalized bodies—in this case those from oppressed caste groups—has been a crucial part of the creation and entrenchment of this surveillance project.

    Drawing on the philosopher Michel Foucault’s analysis of modern disciplinary societies, combined with his later conceptualization of power as governmentality, we contend that tools like surveys, when combined with unbridled policing technologies, set in motion what we call the digital caste panopticon. Through the case of Telangana, we argue that the growing digitization of policing solidifies the confluence of colonialism and caste. We use the term “Brahminical policing” to refer to a violently imposed ideology that positions Brahmins as the epitome of purity in the hierarchy of caste, and that includes within it the eugenic tenets of colonialism. This form of policing extends its reach well beyond ostensibly public spaces, into the everyday lives of marginalized communities. In our analysis, we also challenge the notion of castelessness in the framing of religion, urging the deployment of an anti-caste framework to engage with questions of policing in the Indian subcontinent, and perhaps beyond.
    Caste-ing Criminalization of Muslims in Hyderabad

    Hyderabad, given its heritage of rule by the Nizams (a Muslim monarchical dynasty) until Indian independence, has been home to a rich Muslim culture. Today, Muslims constitute 43 percent of the city’s population, against the national average of 14 percent (and 12 percent in the state of Telangana). Despite this, a 2016 government report noted that over 81 percent of Muslims in the state were recognized as socially and educationally backward classes. Another report, surveying Muslims in the Old City area of Hyderabad, notes that over 87 percent were employed as daily-wage laborers or in skilled trades, as auto drivers, plumbers, electricians, butchers, mechanics, and so on. As scholars like Khalid Ansari note, the popular framing of “Islamophobia” as the main trick in the playbook of the ruling dispensation obscures stratification among Muslim communities. For instance, the 2016 report cited above concludes that certain “occupation-based communities” were targets of discrimination by fellow Muslims.

    The caste system, existing for over three thousand years, is the oldest socioeconomic system of stratification in the Indian subcontinent. Even ostensibly “casteless” religions like Islam have not been spared from its influence. As a birth-based hierarchical social order, also known as the Varna system—in which the priestly Brahmins occupy the highest position, followed by Kshatriyas, Vaishyas, and Shudras—the logic of caste extends to almost all communities and faiths in India. Given its endogamous nature, it is not simply a social practice but also an economic one, with each distinct Jati (a sub-group within a Varna) designated a certain occupation, making occupational roles hereditary. Forest-dwelling indigenous communities, Dalits (against whom untouchability has been perpetrated historically), and nomadic tribes continue to be at the receiving end of this system. The Laws of Manu, or Manusmriti, one of the earliest (divine) legal codes congealing the caste system, ascribe differential punishments to various lower-caste communities, tribes, and women for its violation, thereby locating the root of carcerality within the caste order. In other words, to maintain and reproduce this hierarchical system, the Hindu social order has fundamentally relied on elaborate modes of policing and punishment. These caste-based prescriptions were later given their modern legal form by the British colonial administration.

    In the early 1800s, the British administration treated crime control as a necessity for its goals of commercial trade, which facilitated the establishment of a “Thuggee and Dacoity Department.” The department treated “thugs”—those who committed highway robberies—as a “criminal class,” conceptualized as a hereditary occupation like other caste-based occupations prevalent on the subcontinent. By the 1850s, the colonial government had imposed methods of classification, surveillance, and policing to monitor groups that it termed “criminal communities”—categories established through the consolidation of information on their beliefs, language, culture, and movements. Contemporary scholarship has traced how this form of colonial anthropology emerged in conversation with existing Brahminical ideology, ultimately paving the way for the Criminal Tribes Act, 1871 (CTA), which identified “criminal tribes” who were “addicted” to crime. At the time, Hyderabad was a princely state, which meant that it was not directly under British rule, though British-occupied territories surrounded it. Nevertheless, in 1896, Hyderabad was compelled to promulgate its own version of the CTA, both because the walled city-state was labeled a geography that allowed for easy movement of “dacoits” and because of its large population of Lambadas (an erstwhile-criminalized tribe), who had attained some sociocultural capital yet were seen as a threat.

    A key aim of the CTA was to surveil, resettle, and sedentarize nomadic tribes. In order to confine them to specific villages and reformatories, its surveillance mechanisms pervaded every aspect of nomadic tribes’ social lives. Members were required to register with local police stations, facing fines and penalties if they failed to do so. The British administration employed discriminatory anthropometric techniques, including physical measurements, roll calls to track their presence, and later, as technology advanced, fingerprinting.

    In 1952, following independence from the British Empire, India repealed the CTA. The formerly criminalized communities came to be known as Denotified Tribes (or DNTs). However, this would not be an end to the surveillance that had shadowed DNTs for so long. A parallel story had unfolded in 1948, when the Nizam of Hyderabad refused to integrate with the Indian Union, prompting military action (or “Police Action”) by the Indian state. The communal propaganda of the Arya Samaj (a reformist Hindu organization), which characterized the excesses of the Muslim paramilitary force in Hyderabad as the wholesale rot of Muslim society, offered ample cover. Under the pretext of collapsing law and order and of supporting “oppressed Hindus,” the Union’s annexation resulted in mass killings, sexual violence, and the looting of Muslim homes.
    Muslim Citizenship in Modern Hyderabad

    Following the annexation of Hyderabad, the relationship between Muslims and the police was marked by suspicion—a dynamic that persists to this day. Among human rights scholars of India, the anti-Muslim disposition of policing is now a matter of consensus. At the same time, the heterogeneity of Muslims has often been overlooked. Shaik Yousuf Baba, aka “Sky Baba,” a writer from Telangana, points out that among Muslims, conversations about caste rarely occur, even as many Islamic communities are intricately intertwined with caste. Sonar and Hajam Muslims, for instance, are engaged in goldsmithing and hairdressing (caste-based occupations associated with lower castes).

    Formerly criminalized communities, too, grappled with the blurry boundaries of caste and religion, especially as politically underrepresented groups. As the Renke Commission report on nomadic and denotified tribes notes, DNTs practiced varying faiths, including Hinduism, Islam, and Sikhism. M. Subba Rao, convener of the DNT Political Front, told us,

    Whoever [Dalits and backward castes] converted from these Hindu castes … they are still practicing the old traditional occupations and livelihoods. Though they are Muslims … many of them are stone-cutters. Fakirs [nomadic Muslims] go door to door and to these shops and put dhoops and get money.

    Further, it is evident that Muslims, particularly lower-caste Muslims, are ghettoized even in Hyderabad, where they form a huge share of the population. This puts them at the bottom of what historian Radha Kumar calls the “hierarchy of spaces,” where policing is an intimate part of their everyday lives.

    Consider “Mission Chabutra,” launched in 2015 by the Hyderabad police, which resulted in stop-and-search operations in the Old City, where a large number of Muslims live. Under this mission, “wandering youth” found “loitering” in their neighborhoods at night are forced to scatter after surrendering their fingerprints, which are processed through portable fingerprint scanners. The fingerprints are matched against the police department’s existing records for pending warrants or criminal histories. In a few instances, these police actions also enlist religious leaders in the community to counsel young men to give up their “wayward” ways.

    In their regulation of access to spaces and deployment of a casteist trope of “decency,” the police enforce social discipline in the name of public propriety and perpetuate the legacy of the CTA. Even after the CTA was repealed, several Indian states enacted new laws criminalizing “habitual offenders” instead of “criminal tribes”, upholding the same Brahminical social norms. These laws are still in operation in ten states, including in Telangana.
    How Oppressed Castes Become “Habitual Offenders”

    On a scorching afternoon in April, we visited an award-winning police station in Hyderabad. An engineer-turned-police officer walked us through the different sections of the station: a reception, an administrative desk (which generates First Information Reports, or FIRs, from initial complaints), a women’s help desk, a lockup, a firearms storeroom. Adorning the walls of the station were motivational posters about controlling one’s anger and ego, and the importance of punctuality. While detailing the intensive surveillance of “habitual offenders,” the officer showed us, on his phone, an app displaying a photo of one such individual, taken at 2:00 a.m. outside their house on a routine weekday. When we expressed surprise, he proudly remarked, “It’s like they are living in our custody all the time.” He went on to show us how officers checked in with each “habitual offender” every night and took pictures with them on e-tablets, marking where they were physically present.

    In Telangana today, a “habitual offender” is any adult who has been convicted three times within any consecutive five-year period. Existing records of habitual criminals prior to 1962 were carried over to this legislation, marking a clear continuation of the CTA. Though records under the Habitual Offenders Act, 1962 (carried over from the state of Andhra Pradesh after the two states split), are to be maintained only for a period of five years, this “registration” may be extended for another five. This can result in individuals being summoned to police stations at any time, having their physical mobility restricted, and even being sent to corrective settlements.

    In this context, police manuals, drafted at the state level in India, crucially reveal policing’s role as a knowledge-producing institution. Records created on the basis of these manuals are a double-edged sword: they serve both as “evidence” against marginalized communities and as a basis for decisions about allocating surveillance resources, thereby perpetuating caste-class relations on an everyday basis. A quick examination of the Andhra Pradesh manual (also adopted by Telangana) reveals the various methods through which surveillance of “habitual offenders” has been expanded (beyond the scope of the legislation above) and refined to monitor any behavior deemed suspicious, spanning from the local police station to the broader state-level intelligence departments. Under a broad category of “law and order maintenance,” the manual authorizes the preservation of various registers according to the nature of the alleged offence and the background of the offender. Based on this classification, an individual may be treated as a “casual” or “professional” offender and entered into documents called “Suspect sheets,” “Rowdy sheets,” “Bad Character Rolls,” or “History sheets,” triggering surveillance of varying kinds from police station to police station, district to district, or state to state.

    Moreover, if an alleged first-time offender belongs to a family or group with a criminal history, or is even an associate of a “habitual offender,” they are automatically classified as a “professional offender.” Even minor offences like pickpocketing or “thuggery” are linked to “professional” criminals due to historical associations with specific castes and tribes. A history sheet is created for such individuals, mandating daily check-ins with law enforcement officers. Beat constables are encouraged to conduct nighttime checks on these “history-sheeters” and to maintain daily red-ink entries in the General Diary, which records all police station incidents. This information is shared within the police hierarchy, including neighboring stations, supervisors at the city or district level, and the District Crime Records Bureau, which compiles crime statistics for state and national records. There is virtually no known limit on how widely these databases may be shared; they encompass details of an individual’s physical and behavioral traits, family members and associates, and even changes in their personal life.
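
    To see how mechanically this escalation can operate once digitized, consider the sketch below, which renders the rules described above as Python. It is our own illustration, not the manual’s text or any department’s system: every name and field is invented, and the manual states these rules in prose.

      from dataclasses import dataclass

      # A hedged sketch of the classification rules described above.
      # All names and fields are invented for illustration.
      @dataclass
      class Person:
          conviction_years: list[int]    # years in which convictions were recorded
          offence: str                   # alleged offence, e.g., "pickpocketing"
          family_has_history: bool       # family or group with a "criminal history"
          knows_habitual_offender: bool  # associate of a "habitual offender"

      # Offences historically tied to specific castes and tribes.
      PROFILED_OFFENCES = {"pickpocketing", "thuggery"}

      def is_habitual(p: Person) -> bool:
          # Three convictions within any consecutive five-year period.
          years = sorted(p.conviction_years)
          return any(years[i + 2] - years[i] <= 4 for i in range(len(years) - 2))

      def classify(p: Person) -> str:
          if is_habitual(p):
              return "habitual offender"      # registration, nightly checks
          if p.family_has_history or p.knows_habitual_offender:
              return "professional offender"  # automatic, even for a first alleged offence
          if p.offence in PROFILED_OFFENCES:
              return "professional offender"  # offence type alone opens a history sheet
          return "casual offender"

    The point is not the code but the determinism: a family tie or an offence label, entered once, fixes a category that triggers open-ended surveillance.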

    This discretionary classificatory paradigm of “habitual offenders” (and enumeration under different but overlapping categories of “professional offenders,” “history sheeters,” and so on) clearly continues to reproduce the casteist construction of criminality, albeit in a more palatable administrative form. A family belonging to the Yerukula tribe, a DNT community, told us how members of their family were routinely picked up for questioning and extorted in exchange for not being booked on false charges. If one of them needed to travel, whether across district or state lines, they would have to notify the local police station. They added that the police held copies of their personal identification documents, call records, and biometric data. Per the police manual, police must maintain these records at all outposts, share them with other states, and collate them weekly through the Criminal Intelligence Gazette, which is further assembled by state investigation agencies and privately circulated across the country every month.

    The Telangana government’s 2018 Comprehensive Criminal Survey was the first of its kind to enumerate all “repeat” and “professional offenders.” It collected fingerprints and photographs, geotagged the locations of alleged offenders’ residences, and compiled details of their family trees along with their phone numbers. Even those who had been acquitted of a crime for lack of evidence were included. With the recent federal Criminal Procedure (Identification) Act, 2022, the police have now been empowered to store and share this data at a federal level, with a lengthy shelf life of seventy-five years. The arrival of Big Data surveillance technologies, and their reputation for reducing paperwork, has lent itself to forming an unending continuum around historically criminalized communities, such as DNTs, as objects of policing.
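
    The breadth of such a profile is easier to grasp laid out as a record. Below is a purely hypothetical Python sketch of the fields the survey is described as compiling; none of these names come from the department’s actual schema, and the retention figure simply restates the 2022 Act’s shelf life.

      from dataclasses import dataclass

      # Hypothetical layout of the kind of profile the 2018 survey compiled.
      # Every field name is our invention, for illustration only.
      @dataclass
      class OffenderRecord:
          fingerprints: bytes          # biometrics collected during the survey
          photographs: list[bytes]
          residence_lat: float         # geotagged location of residence
          residence_lon: float
          phone_numbers: list[str]
          family_tree: dict[str, str]  # relation -> name, sweeping in non-accused kin
          acquitted: bool              # included even after acquittal
          retention_years: int = 75    # shelf life under the 2022 Act

    A record like this outlives its subject’s acquittal, their relatives’ innocence, and, at seventy-five years, very possibly the subject themselves.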
    Old Wine, New Bottle: Digitizing Everyday Forms of Visible Policing

    The provisions of the Criminal Tribes Act and the Habitual Offenders Act demonstrate that the Telangana police have historically gathered data on the movements, associations, and activities of specific communities branded as criminal, employing a combination of imprecise classification systems. As we have shown, this abiding interest has been perpetuated through modern surveillance methods that leverage each new generation of technology, escalating digitized policing practices. These include the digitization and expansion of existing criminal databases, underwritten by the myth of technocratic objectivity. This ideological framework has enabled states to develop more intrusive policing techniques for monitoring so-called habitual offenders. The legacy of the CTA now finds expression as “data,” forming the foundation for training digitized surveillance systems.

    As a newly minted state, Telangana adopted its neoliberal-friendly Silicon Valley dreams to reform the institution of policing. It quickly procured sophisticated software that allowed one to search databases of state records using facial and fingerprint recognition, along with any prior antecedents. One could now search this history through a phone number, state-issued identification such as a voter’s registration or driver’s license, and national biometric identification registries like Aadhaar. This infrastructural capacity was built by two relatively unknown, recently incorporated companies: WinC and Tecdatum. Our conversations with tech researchers in the city raise questions about how contracts for constructing this surveillance architecture for the state (modeled on example technologies from Singapore, New York, and Israel) were awarded to companies with no proven record—questions which remain fertile ground for further research.

    The use of technology digitally encodes and hyper-visibilizes everyday forms of policing, while positing digitization as the “invention” of new forms of policing. This is most prominently seen in the use of crime analytics to identify “crime hotspots,” deployed alongside traditional policing methods such as the use of informants and patrolling—methods that both constitute and are contingent upon the creation of “crime hotspots.”

    As we have shown above, the process of delineating “criminal spaces” is synonymous with casteist segregation. Correspondingly, the identification of “crime hotspots” forms the essence of predictive policing and is a common strategy deployed in large cities like Delhi. The creation of crime hotspots also dovetails with the concentration of patrolling resources in such “hotspots.” Through biometrically oriented, data-driven patrolling, the police effectively position themselves in the popular imagination as providers of swift justice. Our analysis of Hyderabad illustrates how the reproduction of order through patrolling is deeply imbricated with the police’s crime-control mission.

    In Hyderabad, the T-S Cop portal acts as a fulcrum for the creation of these “criminal hotspots.” Its main analytic tool is a software system called KiteEye Interactive Mapping of Crimes and Accidents for Police, which tracks where alleged crimes have been registered, the nature of the response, and the number of police officers who responded. Essentially, every time a distress call is made, an accident takes place, or an incident is registered, it is geotagged. At the same time, police officers’ patrol vehicles themselves are also tracked geospatially using GPS chips. A geographic information system (GIS) breaks down different data points, such as patrolling, by different types of vehicles or different categories of incident, and filters details by individual police stations. At the broadest level, through algorithmic plotting of legacy data, the KiteEye generates reports showing bigger “crime hotspots” in different colors, along with smaller scatterplots of crime categories. Across all categories, without exception, “repeat offenders” and their movements are key targets within the software. Leveling its panoptic gaze at neighborhoods selected for discipline and punishment, the software suite thus deploys geographic positioning to attribute the taint of being branded a criminal to the physical spaces where DNT communities live and work.
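
    To illustrate the kind of computation at work, the sketch below shows the simplest form that such “algorithmic plotting” can take: binning geotagged incidents into grid cells and ranking the densest ones. This is our own minimal Python reconstruction, not KiteEye’s code; the cell size and every name here are assumptions.

      from collections import Counter

      # Minimal hotspot sketch (our illustration, not KiteEye's implementation):
      # bin geotagged incidents into roughly 1 km grid cells, then rank the
      # densest cells. Because the input is legacy registration data, areas
      # heavily policed in the past dominate the ranking.
      CELL = 0.01  # grid cell size in degrees (~1 km at Hyderabad's latitude)

      def cell_of(lat: float, lon: float) -> tuple[int, int]:
          return (int(lat / CELL), int(lon / CELL))

      def hotspots(incidents: list[tuple[float, float]], top: int = 10):
          counts = Counter(cell_of(lat, lon) for lat, lon in incidents)
          return counts.most_common(top)  # the densest cells become "crime hotspots"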

    Another important function of the KiteEye is the collation of patrol data against crime occurrence, which enables it to layer “maps” of different indices and assess whether patrol vehicles frequent the areas from which distress calls are most often received. Given that it maps the occurrence of crime, this has a chilling effect on how members of DNT communities access and navigate public spaces. Their very presence is datafied into an alert, a “call to action” for policing systems. Thus, the digitization of a crucial function of everyday policing continues to draw upon the encoded logic of caste, ensuring the “polluted” do not “pollute more.”
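
    In computational terms, the coverage check just described is a join of two per-cell counts. A hedged continuation of the sketch above, reusing the hypothetical cell_of binning function:

      from collections import Counter  # continues the grid sketch above

      def coverage_gaps(calls, patrol_pings, threshold=0.5):
          # Flag grid cells where patrol presence is low relative to distress
          # calls. `calls` and `patrol_pings` are lists of (lat, lon) pairs;
          # `cell_of` is the binning function from the previous sketch.
          call_counts = Counter(cell_of(lat, lon) for lat, lon in calls)
          ping_counts = Counter(cell_of(lat, lon) for lat, lon in patrol_pings)
          return [cell for cell, n in call_counts.items()
                  if ping_counts[cell] < threshold * n]

    Once patrol allocation chases the cells this flags, more police in a neighborhood produce more registered incidents there, which in turn can justify more patrols, a feedback dynamic consistent with the argument above.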

    The completion of this trifecta of “predictive policing” is brought to fruition through a system of informants. Policing has historically relied on a network of “reliable informants,” or makhbars or mukhbirs, predominantly drawn from oppressed caste communities. An app called Dragnet has been designed to synthesize informants’ tips and to enter them into records relating to “sensitive” persons, organizations, rallies, and meetings of interest. A more curious kind of information aggregated and analyzed in the app is the “tips” received from the Special Branches of the police. Dragnet is used exclusively by officers of the Special Branch and other intelligence divisions to enter details on any movement, new information, new associations, or personal details of those being tracked, along with the corresponding “threat perception” level. One can search for an individual’s name, view every tip ever recorded about them, and connect them to the mapping of hotspots. The deployment of data to create crime hotspots, along with patrolling and the use of informants, contributes to the recasting of criminalized communities as “digitized criminal communities.”

    At its core, the term “predictive policing” is a misnomer. It hinges on the audacious premise that policing, however digitized, can transcend its intrinsic biases to foresee future events with unfettered objectivity. Our inquiry substantiates that, in practice, policing operates fundamentally on presumptions rather than on “neutral” predictions. Dwelling upon the red herring of prediction has the effect of obfuscating the history of policing as caste. And to go by Telangana’s example, there is nothing stopping the police from combining ostensibly invisible, predictive technologies considered to be “friendly and smart” with traditional forms of “visible policing”. Advocates of visible policing, who often describe legal safeguards as impediments to justice, are now working closely with private-sector data analytics firms to circumvent settled criminal procedures and “do justice.” A senior police officer candidly described to us the legal safeguards against arrest, especially for minor offenses, as interfering with delivery of justice to victims, for whom the police are the only “savior”. He also noted how younger officers, who now work more closely with technology, were less observant of due process.
    Unify and Punish

    Digital data is the crux of the project of everyday casteist policing in India. While the collection of data by local police stations, beginning with the creation of criminal registers and their evolution into digital forms, has historically been rooted in localized agendas of policing, there is now a push to make policing a nationwide project. In 2009, India’s federal government unveiled a plan to create, by 2013 (though ultimately delayed well into 2017), an IT-based networked infrastructure for police stations: the Crime and Criminal Tracking Network and Systems (CCTNS). The National Crime Records Bureau (NCRB), the nodal agency for aggregating crime statistics in India, was sanctioned as the implementing agency to enable the “real-time investigation of crime and detection of criminals” across roughly fifteen thousand police stations and five thousand higher-level police offices in the country.

    The stated purpose of the CCTNS, which centralizes crime data such as arrest records, charge sheets, and first information reports from across the country, is to speed up criminal investigations and prevent crime. At the same time, however, ever-larger centralized repositories of fingerprints and facial-recognition data continue to be populated alongside it and are linked within the CCTNS database. Even as we grapple with the scale of predictive policing made possible by the CCTNS, a newer concern has arrived: the Interoperable Criminal Justice System (ICJS), which connects e-courts, e-prisons, forensics labs, and the CCTNS. The ICJS goes beyond simply interlinking details of alleged offenders outside the police department. Not only does the infrastructure track an individual through the criminal justice system; it also records details of their associates and anyone else who interfaces with the criminal justice system, even incidentally. Much of this digitization is predicated on the faith that data analytics will be able to map “hotspots” and predict crime, thus helping reduce or control it. As we have shown above, Hyderabad serves as an excellent example of such interlinkages, and it illustrates the dangers of a common legitimizing assumption: namely, that the data upon which such systems rely is collected and analyzed in a neutral manner. These perils become clearer when we look at the process by which such data comes to enter large machine-learning systems in the first place.

    Telangana is at the forefront of India’s technological transformation of policing, and it has already inspired admirers and comparable systems in the country’s other states. A nebulous legal framework around the use of technology for policing and data collection has contributed to its ubiquitous use against criminalized communities, setting in motion the creation of a digital caste panopticon.
    Building a Digital Caste Panopticon

    Digital data-aided policing enables the encoding of casteist carceral spatiality by delineating “criminal spaces,” which are identified as hotspots and sites for predictive policing. The evolution of the carceral into the realm of the digital entrenches and obfuscates the legacies of carcerality through the myth of tech neutrality. By focusing on surveillance tied to biometric classification technology and its portents for casteist policing in India, we have demonstrated how carcerality is not simply confined to designated carceral institutions. Rather, techniques and technologies of confinement and surveillance seep out of “carceral” spaces into everyday domestic, street, and institutional spaces shaped by caste. More importantly, as this piece has argued, historically and today, carcerality in India is synonymous with Brahminical caste order. Recognizing the link between the two will allow us to do the urgent work of accurately naming, unpacking, and challenging this everyday digitized criminalization for what it is: the building blocks of a digital caste panopticon.

    The authors would like to thank Afrah Asif, Deependra Sori, Haripriya Anthagiri, Priya Chaudhary, Sakshi Rai, Sucharita Kanjilal, and Srujana Bej for their support with the article.

    All authors are members of the Criminal Justice and Police Accountability Project (CPA Project), India.

    Hindutva
    https://en.m.wikipedia.org/wiki/Hindutva

    Hindutva (lit. ’Hindu-ness’) is a political ideology encompassing the cultural justification of Hindu nationalism and the belief in establishing Hindu hegemony within India. The political ideology was formulated by Vinayak Damodar Savarkar in 1922. It is used by the Rashtriya Swayamsevak Sangh (RSS), the Vishva Hindu Parishad (VHP), the Bharatiya Janata Party (BJP) and other organisations, collectively called the Sangh Parivar.

    Inspired by European fascism,[9][10] the Hindutva movement has been described as a variant of right-wing extremism,[11] and as “almost fascist in the classical sense”, adhering to a concept of homogenised majority and cultural hegemony.[12][13] Some have also described Hindutva as a separatist ideology.[14][15] Some analysts dispute the identification of Hindutva with fascism, and suggest Hindutva is an extreme form of conservatism or “ethnic absolutism”.

  • Origin Stories: Plantations, Computers, and Industrial Control
    https://logicmag.io/supa-dupa-skies/origin-stories-plantations-computers-and-industrial-control
    https://images.ctfassets.net/e529ilab8frl/5bIaeR1Inlk5WHbyIrsm3d/8714bc0fe842ccb848a4de9989c2da15/12023550816_c21d70fff8_k.jpg?w=1200&fm=jpg&fl=progressive

    The blueprint for modern digital computing was codesigned by Charles Babbage, a vocal champion for the concerns of the emerging industrial capitalist class who condemned organized workers and viewed democracy and capitalism as incompatible. Histories of Babbage diverge sharply in their emphasis. His influential theories on how “enterprising capitalists” could best subjugate workers are well documented in conventional labor scholarship. However, these are oddly absent from many mainstream accounts of his foundational contributions to digital computing, which he made with mathematician Ada Lovelace in the nineteenth century. Reading these histories together, we find that Babbage’s proto-Taylorist ideas on how to discipline workers are inextricably connected to the calculating engines he spent his life attempting to build.

    • Although they are programmable, Babbage’s calculating machines are not computers. Not because of their archaic technology (mechanical rather than electronic), but because Babbage’s machines cannot reach the domain of the computable formalized by Turing, which more fundamentally defines what constitutes the machines we have been deploying since the end of the Second World War. The consequences of this trajectory of the computable, in Turing’s sense, are far more decisive than the dreams of control that Babbage wanted to inscribe in his machines. (This does not mean that Turing is the great orchestrator of the digital world’s harms, any more than Babbage is.)
      To perceive (and critique) the decisive character of computers (and their specificity with respect to all the lineages of calculating machines from which they radically broke away), one must also start from a radical analysis of capitalism that cannot be reduced to derivative phenomena such as exploitation or the division of labor and its control. These phenomena certainly exist, and the struggles that confront them are indispensable, but they are only the consequences of a more fundamental core that remains unthought in critical STS approaches to the digital.

  • The Madness of the Crowd
    https://logicmag.io/intelligence/the-madness-of-the-crowd
    https://images.ctfassets.net/e529ilab8frl/48aUxBW79ZcMEk2CfmsjBs/d5c5fd43b181e225369a0032bc119f07/rob-curran-sUXXO3xPBYo-unsplash.jpg?w=1200&fm=jpg&fl=progressive

    By Tim Hwang (March 2017)

    As the Trump Administration enters its first hundred days, the 2016 election and its unexpected result remains a central topic of discussion among journalists, researchers, and the public at large.

    It is notable the degree to which Trump’s victory has propelled a broader, wholesale evaluation of the defects of the modern media ecosystem. Whether it is “fake news,” the influence of “filter bubbles,” or the online emergence of the “alt-right,” the internet has been cast as a familiar villain: enabling and empowering extreme views, and producing a “post-fact” society.

    This isn’t the first time that the internet has figured prominently in a presidential win. Among commentators on the left, the collective pessimism about the technological forces powering Trump’s 2016 victory is matched in mirror image by the collective optimism about the technological forces driving Obama’s 2008 victory. As Arianna Huffington put it simply then, “Were it not for the Internet, Barack Obama would not be president. Were it not for the Internet, Barack Obama would not have been the nominee.”

    But whereas Obama was seen as a sign that the new media ecosystem wrought by the internet was functioning beautifully (one commentator praised it as “a perfect medium for genuine grass-roots political movements”), the Trump win has been blamed on a media ecosystem in deep failure mode. We could chalk these accounts up to simple partisanship, but that would ignore a whole constellation of other incidents that should raise real concerns about the weaknesses of the public sphere that the contemporary internet has established.

    This troubled internet has been around for years. Fears about filter bubbles facilitating the rise of the alt-right can and should be linked to existing concerns about the forces producing insular, extreme communities like the ones driving the Gamergate controversy. Fears about the impotence of facts in political debate match existing frustrations about the inability of documentary evidence of police killings—widely distributed through social media—to produce real change. Similarly, fears about organized mobs of Trump supporters systematically silencing political opponents online are just the latest data point in a long-standing critique of the failure of social media platforms to halt harassment.

    One critical anchor point is the centrality of the wisdom of the crowd to the intellectual firmament of Web 2.0: the idea that the broad freedom to communicate enabled by the internet tends to produce beneficial outcomes for society. This position celebrated user-generated content, encouraged platforms for collective participation, and advocated the openness of data.

    Inspired by the success of projects like the open-source operating system Linux and the explosion of platforms like Wikipedia, a generation of internet commentators espoused the benefits of crowd-sourced problem-solving. Anthony D. Williams and Don Tapscott’s Wikinomics (2006) touted the economic potential of the crowd. Clay Shirky’s Here Comes Everybody (2008) highlighted how open systems powered by volunteer contributions could create social change. Yochai Benkler’s The Wealth of Networks (2006) posited a cooperative form of socioeconomic production unleashed by the structure of the open web called “commons-based peer production.”

    Such notions inspired movements like “Gov 2.0” and projects like the Sunlight Foundation, which sought to publish government data in order to reduce corruption and enable the creation of valuable new services by third parties. They also inspired a range of citizen journalism projects, empowering a new fourth estate.

    Intelligence Failure
    The platforms inspired by the “wisdom of the crowd” represented an experiment. They tested the hypothesis that large groups of people can self-organize to produce knowledge effectively and ultimately arrive at positive outcomes.

    In recent years, however, a number of underlying assumptions in this framework have been challenged, as these platforms have increasingly produced outcomes quite opposite to what their designers had in mind. With the benefit of hindsight, we can start to diagnose why. In particular, there have been four major “divergences” between how the vision of the wisdom of the crowd optimistically predicted people would act online and how they actually behaved.

    First, the wisdom of the crowd assumes that each member of the crowd will sift through information to make independent observations and contributions. If not, it hopes that at least a majority will, such that a competitive marketplace of ideas will be able to arrive at the best result.

    Second, collective intelligence requires aggregating many individual observations. To that end, it assumes a sufficient diversity of viewpoints. However, open platforms did not generate or actively cultivate this kind of diversity, instead more passively relying on the ostensible availability of these tools to all.

    Third, collective intelligence assumes that wrong information will be systematically weeded out as it conflicts with the mass of observations being made by others. Quite the opposite played out in practice, as it ended up being much easier to share information than to evaluate its accuracy. Hoaxes spread very effectively through the crowd, from bogus medical beliefs and conspiracy theories to faked celebrity deaths and clickbait headlines.

    Fourth, collective intelligence was assumed to be a vehicle for positive social change because broad participation would make wrongdoing more difficult to hide. Though this latter point turned out to be arguably true, transparency alone was not the powerful disinfectant it was assumed to be.

    The ability to capture police violence on smartphones did not result in increased convictions or changes to the underlying policies of law enforcement. The Edward Snowden revelations failed to produce substantial surveillance reform in the United States. The leak of Donald Trump’s Access Hollywood recording failed to change the political momentum of the 2016 election. And so on. As Aaron Swartz warned us in 2009, “reality doesn’t live in the databases.”

    Ultimately, the aspirations of collective intelligence underlying a generation of online platforms proved far more narrow and limited in practice. The wisdom of the crowd turned out to be susceptible to the influence of recommendation algorithms, the designs of bad actors, in-built biases of users, and the strength of incumbent institutions, among other forces.

    The resulting ecosystem feels deeply out of control. The promise of a collective search for the truth gave way to a pernicious ecosystem of fake news. The promise of a broad participatory culture gave way to campaigns of harassment and atomized, deeply insular communities. The promise of greater public accountability gave way to waves of outrage with little real change. Trump 2016 and Obama 2008 are through-the-looking-glass versions of one another, with the benefits from one era giving rise to the failures of the next.

    To the extent that the vision of the wisdom of the crowd was naive, it was naive because it assumed that the internet was a spontaneous reactor for a certain kind of collective behavior. It mistook what should have been an agenda, an ongoing program for the design of the web, for the way things already were. It assumed users had the time and education to contribute and evaluate scads of information. It assumed a level of class, race, and gender diversity in online participation that never materialized. It assumed a gentility of collaboration and discussion among people that only ever existed in certain contexts. It assumed that the simple revelation of facts would produce social change.

    In short, the wisdom of the crowd didn’t describe where we were, so much as paint a picture of where we should have been going.

    The vision of collective participation embedded in the idea of the wisdom of the crowd rests on the belief in the unique potential of the web and what it might achieve. Even as the technology evolves, that vision—and a renewed defense of it—must guide us as we enter the next decade.

    #Tim_Hwang #Mythes_internet #Sagesse_des_foules #Intelligence_collective

  • Built to Last
    https://logicmag.io/care/built-to-last

    When overwhelmed unemployment insurance systems malfunctioned during the pandemic, governments blamed the sixty-year-old programming language COBOL. But what really failed? At the time of this writing, in July 2020, the COVID-19 pandemic has killed over 133,000 people in the United States. The dead are disproportionately Black and Latinx people and those who were unable, or not allowed by their employers, to work remotely. During the pandemic, we’ve seen our technological infrastructures (...)

    #obsolescence #sexisme #COVID-19 #santé #technologisme

    https://images.ctfassets.net/e529ilab8frl/tRbxY0DnXIP6QE7RBrNB9/8ae1e1cf7b4d7530e4d3500a154001dc/mar-hicks.png

    • Excerpt:

      Many of these men fancied themselves to be a cut above the programmers who came before, and they often perceived COBOL as inferior and unattractive, in part because it did not require abstruse knowledge of underlying computer hardware or a computer science qualification.

      Consciously or not, the last thing many male computer scientists entering the field wanted was to make the field easier to enter or code easier to read, which might undermine their claims to professional and “scientific” expertise.

      In a broader sense, hating COBOL was—and is—part of a struggle between consolidating and protecting computer programmers’ professional prestige on the one hand, and making programming less opaque and more accessible on the other. There’s an old joke among programmers: “If it was hard to write, it should be hard to read.” In other words, if your code is easy to understand, maybe you and your skills aren’t all that unique or valuable. If management thinks the tools you use and the code you write could be easily learned by anyone, you are eminently replaceable.

      one contemporary programmer, who works mainly in C++ and Java at IBM, told me, “Every new programming language that comes out that makes things simpler in some way is usually made fun of by some contingent of existing programmers as making programming too easy—or they say it’s not a ‘real language.’”

      “It’s about gatekeeping, and keeping one’s prestige and importance in the face of technological advancements that make it easier to be replaced by new people with easier to use tools.” Gatekeeping is not only done by people and institutions; it’s written into programming languages themselves.

      modern computing has started to become undone, and to undo other parts of our societies, through the field’s high opinion of itself, and through the way that it concentrates power into the hands of programmers who mistake social, political, and economic problems for technical ones, often with disastrous results.

      In order to care for technological infrastructure, we need maintenance engineers, not just systems designers—and that means paying for people, not just for products.

      Older systems have value, and constantly building new technological systems for short-term profit at the expense of existing infrastructure is not progress. In fact, it is among the most regressive paths a society can take.

      The blessing and the curse of good infrastructure is that when it works, it is invisible: which means that too often, we don’t devote much care to it until it collapses.

  • Aided by #Palantir, the LAPD Uses Predictive Policing to Monitor Specific People and Neighborhoods
    https://theintercept.com/2018/05/11/predictive-policing-surveillance-los-angeles

    A new report details the Los Angeles #Police Department’s use of algorithms to identify “hot spots” and “chronic offenders” and target them for surveillance.

    Enter the Dragnet
    https://logicmag.io/commons/enter-the-dragnet

    The following is an excerpt from Predict and Surveil: Data, Discretion, and the Future of Policing by Sarah Brayne.

    via @hubertguillaud — the #silicon_army in fine form

  • A Brief History of the Gig
    https://logicmag.io/security/a-brief-history-of-the-gig

    Where does the gig economy come from? In early 2012, San Francisco taxi drivers began to raise the alarm at organizing meetings and city hearings about “bandit tech cabs” pilfering their fares. “I’ll sit at a hotel line, and I see one of these guys in their own car come up, hailed by some guy’s app, and they’ll turn down my fare,” Dave, who had been driving a taxi for fourteen years, said at a meeting that April. “They steal it. It’s insulting.” Other cabbies said they were seeing the same thing, (...)

    #discrimination #travail #pauvreté #lutte #GigEconomy #domination #technologisme #algorithme #Uber #Lyft #AmazonMechanicalTurk (...)

    https://images.ctfassets.net/e529ilab8frl/Jbpd7qRglJZtQecM6dbe7/0dc321d7169ebb17debdc32cf92d3da4/15.png

  • Oil is the New Data
    https://logicmag.io/nature/oil-is-the-new-data

    Big Tech is forging a lucrative partnership with Big Oil, building a new carbon cloud that just might kill us all. I remember being nervous when I flew into Atyrau, Kazakhstan. Before boarding the flight, one of the business managers who organized the trip sent me a message with precise instructions on how to navigate the local airport: Once you land, get into the bus on the right side of the driver. This side opens to the terminal. Pass through immigration, pick up your luggage, and (...)

    #Apple #Chevron #Exxon_Mobil #Microsoft #Oracle #Total #algorithme #écologie #minerais #BigData #CloudComputing #Amazon #GoogleCloud #AWS #comportement #InternetOfThings #surveillance (...)

    https://images.ctfassets.net/e529ilab8frl/2JLIy42sPCr5MEuBgBd2Rc/bdbbe0654528219fce577feb516ab5ae/zerocool.jpg

  • The Messy Truth About Social Credit | Shazeda Ahmed, Logic (01/05/2019) via @oliviertesquet
    https://logicmag.io/china/the-messy-truth-about-social-credit

    (…) In some instances, blacklists are adapting to new media while retaining their original function of shaming people into changing their behavior. The enormously popular social video streaming app TikTok (抖音, douyin) has partnered with a local court in Nanning, Guangxi to display photographs of blacklisted people as advertisements between videos, in some cases offering reward payments (a percentage of the amount the person owes) for information about these people’s whereabouts. Much like the other apps and websites that take part in these state-sponsored efforts, TikTok does not disclose in its user-facing terms of service that it works with the local government of Nanning, and potentially other cities, to publicly shame blacklisted individuals.

    On China’s so-called “social credit” system, see the very thorough “Bons et mauvais Chinois,” published in @mdiplo in January 2019.
    https://www.monde-diplomatique.fr/2019/01/RAPHAEL/59403

    • The article makes clear that the social credit policy pursued in China (but also, in other forms, in many other countries) does not amount to digital applications feeding centralized databases that trace citizens’ every move; rather, more diffusely, it seeks to induce behavior more in line with the needs of the economy.

  • Model Metropolis
    https://logicmag.io/06-model-metropolis

    Despite all this attention, few writers looked closely at the work which sparked Wright’s interest in urban simulation in the first place. Largely forgotten now, Jay Forrester’s Urban Dynamics put forth the controversial claim that the overwhelming majority of American urban policy was not only misguided but actually aggravated the very problems it was intended to solve. In place of Great Society-style welfare programs, Forrester argued that cities should take a less interventionist approach to the problems of urban poverty and blight, and instead encourage revitalization indirectly through incentives for businesses and for the professional class. Forrester’s message proved popular among conservative and libertarian writers, Nixon Administration officials, and other critics of the Great Society for its hands-off approach to urban policy. This outlook, supposedly backed up by computer models, remains highly influential among establishment pundits and policymakers today.

  • How To Kill Your Tech Industry, by Marie Hicks
    https://logicmag.io/05-how-to-kill-your-tech-industry

    In World War II, Britain invented the electronic computer. By the 1970s, its computing industry had collapsed—thanks to a labor shortage produced by sexism.

    JavaScript is for Girls, by Miriam Posner
    https://logicmag.io/01-javascript-is-for-girls

    Decades ago, men kicked women out of the programming profession just as it was taking off. Now that women are fighting their way back in, men are finding new ways to protect their status.

    Women in Computer Science: A 1983 MIT Report
    https://logicmag.io/women-in-computer-science-a-1983-mit-report

    In 1983, female computer scientists at MIT wrote a report on how sexism was pushing women out of their—historically female—field.

    These were their recommendations.

    In the age of Uber and VC sexual harassment and the Google memo... how much has changed? How many do we still need to start following?

    #informatique #sexisme

    • Thanks @fil, this gives me ammunition!
      When I try to explain, from a political angle /which also protects me/, how sexism operates in computing, men often take it personally, and we end up in a defensive discussion aimed at relieving the male interlocutor of guilt and calling me back to the order of patriarchy (which always shows up unannounced).
      Or else there is always some woman who once told them it doesn’t bother her.
      It’s as if it were all hot air, or worse, as if I found myself in a situation where I have to reassure the other person, modify my speech and my own behavior, and drown myself in the feminine role of the nurturing mother. I’m tired of explaining, without success, that the problem does not stem from women’s individual behavior. It is high time the community of men who make up more than 80 percent of free software got off their asses and at least began to ask themselves about the absence of women.

  • Letter from Shenzhen, by Xiaowei R. Wang (Logic Mag)
    https://logicmag.io/04-letter-from-shenzhen

    This is the new shanzhai. It’s open-source on hyperspeed — where creators build on each other’s work, co-opt, repurpose, and remix in a decentralized way, creating original products like a cell phone with a compass that points to Mecca (selling well in Islamic countries) and simple cell phones that have modular, replaceable parts which need little equipment to open or repair.

  • Austerity is an Algorithm
    https://logicmag.io/03-austerity-is-an-algorithm

    The Australian government recently tried to replace social services with software. What does fully automated austerity look like? First there are the text messages. Impersonal, incessant, and devoid of context, they reveal few hints of their purpose. The language is so vague you’d be forgiven for thinking it was spam. “Message from the Probe Group regarding an urgent matter. Please call us.” Then, a deluge of phone calls—up to ten times a day, often after-hours—from an unknown mobile number. (...)

    #algorithme #surveillance #pauvreté


  • “L’austérité est un algorithme” (Austerity is an Algorithm)
    http://www.internetactu.net/a-lire-ailleurs/lausterite-est-un-algorithme

    The excellent webzine Logic (@logic_magazine) looks back at the Australian government’s recent replacement of social services with software. Echoing Virginia Eubanks, who has examined this phenomenon in the United States, the writer Gillian Terzis (@gillianterzis) shows us what automated austerity looks like in Australia. In Australia, the (...)

    #A_lire_ailleurs #Enjeux #algorithmes #big_data #eAdministration #pauvreté #politiques_publiques #services_publics #surveillance

    • Austerity is an Algorithm, Gillian Terzis (the original article)
      https://logicmag.io/03-austerity-is-an-algorithm

      an excerpt from the summary by InternetActu...

      People had no recourse to explain or contest their situation, as the thousands of testimonies collected on the NotMyDebt site underscore: 29 million calls to Centrelink went unanswered in 2016! Yet the investigation showed that most of these debt claims were miscalculated or even nonexistent. In fact, the chosen calculation method notably failed to account for the income fluctuations of casual and contract workers, hence the discrepancies between estimated income and the level of benefit entitlement. Simple typos between employer names in the two systems could generate repayment demands. The software, capable of generating 20,000 automated claims per week, seemed all the more prolific given that the agencies charged with recovering the debts worked on commission.

      #austérité #droits_sociaux #dette