Rechercher : engelbart

  • William English, Who Helped Build the Computer Mouse, Dies at 91 - The New York Times

    William English, the engineer and researcher who helped build the first computer mouse and, in 1968, orchestrated an elaborate demonstration of the technology that foretold the computers, tablets and smartphones of today, died on July 26 in San Rafael, Calif. He was 91.

    His death, at a medical facility, was confirmed by his wife, Roberta English, who said the cause was respiratory failure.

    In the late 1950s, after leaving a career in the Navy, Mr. English joined a Northern California research lab called the Stanford Research Institute, or S.R.I. (now known as SRI International). There he met Douglas Engelbart, a fellow engineer who hoped to build a new kind of computer.

    At a time when only specialists used computers, entering and retrieving information through punched cards, typewriters and printouts, Mr. Engelbart envisioned a machine that anyone could use simply by manipulating images on a screen. It was a concept that would come to define the information age, but by his own admission Mr. Engelbart had struggled to explain his vision to others.

    Mr. English, known to everyone as Bill, was one of the few who understood these ideas and who had the engineering talent, patience and social skills needed to realize them. “He was the guy who made everything happen,” said Bill Duvall, who worked alongside Mr. English during those years. “If you told him something needed to be done, he figured out how to do it.”

    After Mr. Engelbart had envisaged the computer mouse and drawn a rough sketch of it on a notepad, Mr. English built it in the mid-1960s. Housed inside a small pinewood case, the device consisted of two electrical mechanisms, called potentiometers, that tracked the movement of two small wheels as they moved across a desktop. They called it a mouse because of the way the computer’s on-screen cursor, called a CAT, seemed to chase the device’s path.
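    The mechanism can be pictured in a few lines of code. This is purely an illustrative sketch, not Engelbart's or English's actual design: the class name, the scale factor and the readings are invented. The key idea is that each wheel (via its potentiometer) feeds exactly one axis, so moving the mouse diagonally turns both wheels at once, each partly rolling and partly skidding.

```python
# Illustrative model of a two-wheel mouse: each wheel drives a
# potentiometer whose rotation is read as displacement along one axis.
# All names and the scale factor are hypothetical.

class TwoWheelMouse:
    """Accumulates cursor position from two orthogonal wheel readings."""

    def __init__(self, counts_per_unit=1.0):
        self.counts_per_unit = counts_per_unit  # hypothetical scale factor
        self.x = 0.0
        self.y = 0.0

    def update(self, wheel_x_delta, wheel_y_delta):
        # Each wheel contributes only to its own axis.
        self.x += wheel_x_delta / self.counts_per_unit
        self.y += wheel_y_delta / self.counts_per_unit
        return self.x, self.y

mouse = TwoWheelMouse()
mouse.update(3, 0)         # pure horizontal motion: only the x wheel rolls
mouse.update(0, 4)         # pure vertical motion: only the y wheel rolls
print(mouse.update(1, 1))  # diagonal motion turns both wheels → (4.0, 5.0)
```

    Two potentiometers are thus enough for full two-dimensional tracking, which is part of what made the pinewood prototype so simple.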

    As they were developing the system, both Mr. English and Mr. Engelbart were part of the government-funded L.S.D. tests conducted by a nearby lab called the International Foundation for Advanced Study. Both took the psychedelic as part of a sweeping effort to determine whether it could “open the mind” and foster creativity.

    Though Mr. Engelbart oversaw the NLS project, the 1968 demonstration in San Francisco was led by Mr. English, who brought both engineering and theater skills to the task. In the mid-1950s he had volunteered as a stage manager for a Bay Area theater troupe called The Actor’s Workshop.

    For the San Francisco event, he used a video projector the size of a Volkswagen Beetle (borrowed from a nearby NASA lab) to arrange and project the live images behind Mr. Engelbart as he demonstrated NLS from the stage. He had been able to set up the wireless link that sent video between the Menlo Park computer lab and the auditorium after befriending a telephone company technician.
    Mr. English helped orchestrate an elaborate demonstration of the technology that foretold the computers, tablets and smartphones of today. Credit: via English family

    Three years after the demonstration, Mr. English left S.R.I. and joined a new Xerox lab called the Palo Alto Research Center, or PARC. There he helped adapt many of the NLS ideas for a new machine called the Alto, which became a template for the Apple Macintosh, the first Microsoft Windows personal computers and other internet-connected devices.

    #Histoire_numérique #Souris #Bill_English #SRI_international #Xerox_park #Mother_of_all_demo

  • Douglas Engelbart: The Mother of All Demos

    On December 9, 1968, Douglas C. Engelbart and the group of 17 researchers working with him in the Augmentation Research Center at Stanford Research Institute in Menlo Park, CA, presented a 90-minute live public demonstration of the online system, NLS, they had been working on since 1962. The public presentation was a session of the Fall Joint Computer Conference held at the Convention Center in San Francisco, and it was attended by about 1,000 computer professionals. This was the public debut of the computer mouse. But the mouse was only one of many innovations demonstrated that day, including hypertext, object addressing and dynamic file linking, as well as shared-screen collaboration involving two persons at different sites communicating over a network with audio and video interface.

    #histoire #informatique (not really a #film #documentaire, but there are some downright legendary passages: the presentation of the mouse, the shopping-list application, the demonstration of the relaxed keyboard position…)

  • Douglas Engelbart, inventor of the mouse, has died

    The engineer and computing pioneer Douglas Engelbart, inventor of the computer mouse, died on the evening of Tuesday, July 2 at the age of 88 at his California home in Atherton, in the heart of Silicon Valley, the institute bearing his name announced on Wednesday.
    Born in Oregon, he had moved south to become a researcher at the Stanford Research Institute after studying electrical and computer engineering in the 1950s, an era when computers still filled entire rooms. His research covered videoconferencing, teleconferencing, electronic mail, “windows” and the hypertext link, but he is best known for having invented the computer mouse.

    In the NYT:

    In December 1968, however, he set the computing world on fire with a remarkable demonstration before more than a thousand of the world’s leading computer scientists at the Fall Joint Computer Conference in San Francisco, one of a series of national conferences in the computer field that had been held since the early 1950s. Dr. Engelbart was developing a raft of revolutionary interactive computer technologies and chose the conference as the proper moment to unveil them.

    For the event, he sat on stage in front of a mouse, a keyboard and other controls and projected the computer display onto a 22-foot-high video screen behind him. In little more than an hour, he showed how a networked, interactive computing system would allow information to be shared rapidly among collaborating scientists. He demonstrated how a mouse, which he had invented just four years earlier, could be used to control a computer. He demonstrated text editing, video conferencing, hypertext and windowing.

    Doug Engelbart, who foresaw the modern computer, dies at 88 (Wired UK)

    Engelbart first revealed his creation to the rest of the world in 1968, at an event in San Francisco, about an hour’s drive north from SRI, and the unveiling, before many of the world’s leading computer scientists, has since become known as “The Mother Of All Demos” (see video below).
    In the audience that day in 1968, “shivering like mad, with a [40] degree temperature,” was a young man named Alan Kay. Kay would go on to join Xerox PARC, where he worked on the research lab’s seminal Alto computer and the groundbreaking object-oriented programming environment known as Smalltalk. He was among the few who saw the demo — and Engelbart — for what they were.

    “He was one of the very few people very early on who were able to understand not only that computers could do a lot of things that were very familiar, but that there was something new about computers that allow us to think in a very different way — in a stronger way,” Kay said during the 40th anniversary celebration.

    Ah! Smalltalk…

    And the video (excerpts) that goes with it, December 9, 1968…

  • Beacons, marketing and the neoliberal logic of space, or: The Engelbart overshoot

    There was a powerful dream that sustained (and not incidentally, justified) half a century’s inquiry into the possibilities of information technology, from Vannevar Bush to Doug Engelbart straight through to Mark Weiser. This was the dream of augmenting the individual human being with instantaneous access to all knowledge, from wherever in the world he or she happened to be standing at any given moment. As toweringly, preposterously ambitious as that goal seems when stated so baldly, it’s hard to conclude anything but that we actually did achieve that dream some time ago, at least as a robust technical proof of concept.

    We achieved that dream, and immediately set about betraying it. We betrayed it by shrouding the knowledge it was founded on in bullshit IP law, and by insisting that every interaction with it be pushed through some set of mostly invidious business logic. We betrayed it by building our otherwise astoundingly liberatory propositions around walled gardens and proprietary standards, by putting the prerogatives of rent-seeking ahead of any move to fertilize and renew the commons, and by tolerating the infestation of our informational ecology with vile, value-destroying parasites. These days technical innovators seem more likely to be lauded for devising new ways to harness and exploit people’s life energy for private gain than for the inverse.

    In fact, you and I now draw breath in a post-utopian world — a world where the tide of technical idealism has long receded from its high-water mark.

  • 50 years on, we’re living the reality first shown at the “Mother of All Demos” | Ars Technica

    A half century ago, computer history took a giant leap when Douglas Engelbart—then a mid-career 43-year-old engineer at Stanford Research Institute in the heart of Silicon Valley—gave what has come to be known as the “mother of all demos.”

    On December 9, 1968 at a computer conference in San Francisco, Engelbart showed off the first inklings of numerous technologies that we all now take for granted: video conferencing, a modern desktop-style user interface, word processing, hypertext, the mouse, collaborative editing, among many others.

    Even before his famous demonstration, Engelbart outlined his vision of the future more than a half-century ago in his historic 1962 paper, “Augmenting Human Intellect: A Conceptual Framework.”

    To open the 90-minute-long presentation, Engelbart posited a question that almost seems trivial to us in the early 21st century: “If in your office, you as an intellectual worker were supplied with a computer display, backed up by a computer that was alive for you all day, and was instantly responsible—responsive—to every action you had, how much value would you derive from that?”

    Of course at that time, computers were vast behemoths that were light-years away from the pocket-sized devices that have practically become an extension of ourselves.

    #Histoire_informatique #Mother_of_all_demos #Douglas_Engelbart


    From the closing panel at the 1995 Brown/MIT Vannevar Bush Symposium, featuring Doug Engelbart, Alan Kay, Ted Nelson, and Tim Berners-Lee.

    Now, the abortion that happened after PARC was the misunderstanding of the user interface that we did for children, which was the overlapping window interface which we made as naive as absolutely we possibly could to the point of not having any workflow ideas in it, and that was taken over uncritically out into the outside world.

    So you are basically proposing some kind of information SWAT team that can move swiftly through an organization, or is it going to be some sort of elite ‘Einsatzgruppe’ in the files? This is a very exciting and interesting concept, but how would that function organizationally?

    Alan Kay: Looking back, I think that one of the paradoxes is that we made a complete mistake when we were doing the interface at PARC because we assumed that the kids would need an easy interface because we were going to try and teach them to program and stuff like that, but in fact they are the ones who are willing to put hours into getting really expert at things - shooting baskets, learning to hit baseballs, learning to ride bikes, and now on video games. I have a four-year old nephew who is really incredible and he could use NLS fantastically if it were available. He would be flying through that stuff, because his whole thing is to become part of the system he’s interacting with. So if I had had that perspective I would have designed a completely different interface for the kids, one in which how you became expert was much more apparent than what I did. So I’m sorry for what I did.

    Doug Engelbart :
    Vannevar Bush :
    Ted Nelson :
    Alan Kay :
    Tim Berners-Lee :

    #gui #interaction #IHM #ergonomie #programmation #apprentissage #web #xerox_parc

  • « The Future of Programming » - Bret Victor

    an excellent talk on #programmation, retro-style

    “Why did all these ideas happen during this particular time period?”

    There may be a number of reasons.

    The story I told in the talk — “they didn’t know what they were doing, so they tried everything” — was essentially that programming at the time was in the “pre-paradigm phase”, as defined by Thomas Kuhn in The Structure of Scientific Revolutions. (...)

    But there’s another story, which has to do with funding models. Much fundamental research at the time, including Engelbart’s NLS and the Internet, was funded by #ARPA, an agency of the US Defense Department which had been given significant resources due to the cold war.

    ARPA ushered in an era of abundant funding for university projects, offering far more in terms of funding than any other research funds at the time. Where institutions such as the National Science Foundation and the Three Services Program provided funding to research programs at the level of tens of thousands of dollars, ARPA was willing to throw millions into the creation and support of promising research efforts.

    Part of what made ARPA funding so successful was that its directors (such as J.C.R. Licklider and Bob Taylor) were free to aggressively seek out and fund promising individuals with “no strings attached”.

    #recherche #silicon_army

    With the same title but a different approach, another very interesting article published the same day:

    The future of programming - O’Reilly Radar

    it’s worth examining the house of cards we’re building with our current approach to software development. The problem is simple: the brain can only fit so much inside it. To be a programmer today, you need to be able to execute the program you’re writing inside your head.

    When the problem space gets too big, our reaction is to write a framework that makes the problem space smaller again. And so we have operating systems that run on top of CPUs. Libraries and user interfaces that run on top of operating systems. Application frameworks that run on top of those libraries. Web browsers that run on top of those. JavaScript that runs on top of browsers. JavaScript libraries that run on top of JavaScript. And we know it won’t stop there.

    We’re like ambitious waiters stacking one teacup on top of the other. Right now, it looks pretty wobbly. We’re making faster and more powerful CPUs, but getting the same kind of subjective application performance that we did a decade ago. Security holes emerge in frameworks that put large numbers of systems at risk.

  • The interface is not a representation

    A text entitled De l’interface à la surface : Sur la conception représentationnaliste de l’interface (“From interface to surface: on the representationalist conception of the interface”), presented at Annie Gentès’s Co-design seminar in June 2011

    Its author, on Twitter:

    An old text that may interest those who like interfaces and phenomenological horizons.

    The approach is amusing:

    This document aims to rule out two classic ways of talking about it: the one that reduces the interface to engineering problems, and the one that reduces the interface to communication problems.

    @louije: I imagine you’ve read Bret Victor’s articles (“The Ladder of Abstraction” and “Magic Ink”); what do you think of them? How do you tie them in with your argument?

    #interface #IHM #approche #ingénierie #communication #créativité

    • The reference to the machine disappears as soon as the interface takes the user’s activity as its object, rather than the workings of the devices that activity requires. We stop treating the interface as a tool, as a means, as an intermediary. We can get on to the serious business.

      this is an argument Neal Stephenson argued against in “In the Beginning Was the Command Line”, which I invite you to read! I would have liked to see it published in French, but that hasn’t happened (yet).

      cc: @louije

    • @Fil The computer becoming invisible, losing its truly computational dimension: that is a hard question for those who learned to think with the computing of the ’80s and ’90s. There is real value in understanding how a computer works. But with so much activity in the “post-PC” space, with locked-down, app-oriented interfaces like the iPad’s, it’s a shame there is so little effort in the other direction too, toward recovering a bit of Engelbart’s ideal.

      It’s funny to see that this article, which wants to revive innovation in the PC (rather than in tablets, etc.), only proposes things people were already talking about in 1995:

    • @0gust1 Bret Victor ♥. I don’t understand why he doesn’t already have a Renaissance-style workshop where all his ideas are written down, built, and exhibited in every salon in the world. His approach seems to me partly inherited from Papert and Piaget: make ideas and processes concrete, to spare the reader / student from having to wield abstraction to understand something. (Cf.

    • @Fil I increasingly share this kind of idea; then again, what you’re (we’re) talking about there is more the OS, which is a rather special kind of interface: it has the broadest audience of all (from the novice user “who just wants to do ...” to the geek developer). If the machine doesn’t show through enough in some of an OS’s interfaces, that OS is rotten (from the point of view of a machine manipulator, a dev, a computer person).

      Thanks for the link to Stephenson’s essay, I’ll read it with interest.

      That said, he himself says:

      The essay was written before the advent of Mac OS X. In a Slashdot interview in 2004, he remarked:
      I embraced OS X as soon as it was available and have never looked back. So a lot of In the Beginning...was the Command Line is now obsolete. I keep meaning to update it, but if I’m honest with myself, I have to say this is unlikely.

      What I like in @louije’s paper is the stance that emerges from it, similar to the ones Bret Victor promotes: back to basics, we set aside (for now) the engineering and its corollary, communication (how to translate the machine for the poor user behind it), to concentrate on the user’s task and actions (manipulative behaviors, perception/action loops). I think this matters especially in pedagogy (Bret Victor’s examples), creative software, and the like: the fields where it is really important to have very little abstraction between action and feedback.
      On the other hand, I think there are contexts where the machine really must be present, and the intermediate abstractions must be properly perceivable.

      @louije: That is more or less what is happening, with the internet. People reproduce his work and draw inspiration from it. From memory, there is a code editor (IDE) based on the ideas he laid out in a talk that is currently being built:
      On Papert and Piaget, Bret Victor claims them explicitly in several places in his output (he notably cites Mindstorms in his bibliography of must-reads).

    • @0gust1 Yes, it’s clear this gentleman knows the texts very well. It’s always a pleasure to see inventors build on the literature and make theories concrete.

      On Stephenson and Mac OS X, on the other hand, I’m more circumspect. If anything, I find OS X less geeky than classic Mac OS. Over the years OS X has hidden more and more of its workings from the user (turning the iPhoto library into a package, stashing away the ~/Library folder, all the strange contortions for storing iCloud documents, etc.), whereas under System 7 and its successors, with the System Folder, the file system was the real interface for managing the machine: remove a file and, presto, the feature disappears.