Re: Good reading - Linux in the News
Hi All:
In response to Joseph Sakosky's posting regarding Ullman's
series of articles dealing with her encounter with Linux and
the rediscovery of thinking, I was also impressed with
these articles. I edited out the commercial nonsense and
produced four HTML documents which contain the text of her
articles.
I do recommend them; they are very good reading.
Ira Clavner
clavner@inet.net

Title: Salon 21st | The dumbing-down of programming
The dumbing-down of programming
PART ONE: REBELLING AGAINST MICROSOFT, "MY COMPUTER" AND EASY-TO-USE WIZARDS, AN ENGINEER REDISCOVERS THE JOYS OF DIFFICULT COMPUTING.
BY ELLEN ULLMAN
Last month
I committed an act of technical rebellion: I bought one operating system
instead of another. On the surface, this may not seem like much, since
an operating system is something that can seem inevitable. It's there when
you get your machine, some software from Microsoft, an ur-condition that
can be upgraded but not undone. Yet the world is filled with operating
systems, it turns out. And since I've always felt that a computer system
is a significant statement about our relationship to the world -- how we
organize our understanding of it, how we want to interact with what we
know, how we wish to project the whole notion of intelligence -- I suddenly
did not feel like giving in to the inevitable.
My intention had been
to buy an upgrade to Windows NT Server, which was a completely sensible
thing for me to be doing. A nice, clean, up-to-date system for an extra
machine was the idea, somewhere to install my clients' software; a reasonable,
professional choice in a world where Microsoft platforms are everywhere.
But somehow I left the store carrying a box of Linux from a company called
Slackware. Linux: home-brewed, hobbyist, group-hacked, a UNIX-like operating
system created in 1991 by Linus Torvalds, then passed around from hand to
hand like so much anti-Soviet samizdat. Noncommercial, sold on the
cheap mainly for the cost of the documentation, impracticable except perhaps
for the thrill of actually looking at the source code and utterly useless
to my life as a software engineering consultant.
But buying Linux was
no mistake. For the mere act of installing the system -- stripping down
the machine to its components, then rebuilding its capabilities one by
one -- led me to think about what has happened to the profession of programming,
and to consider how the notion of technical expertise has changed. I began
to wonder about the wages, both personal and social, of spending so much
time with a machine that has slowly absorbed into itself as many complications
as possible, so as to present us with a façade that says everything
can and should be "easy."
I began by ridding
my system of Microsoft. I came of technical age with UNIX, where I learned
with power-greedy pleasure that you could kill a system right out from
under yourself with a single command. It's almost the first thing anyone
teaches you: Run as the root user from the root directory, type in rm -rf
*, and, at the stroke of the ENTER key, gone are all the files and directories.
Recursively, each directory deleting itself once its files have been deleted,
right down to the very directory from which you entered the command: the
snake swallowing its tail. Just the knowledge that one might do such great
destruction is heady. It is the technical equivalent of suicide, yet UNIX
lets you do it anyhow. UNIX always presumes you know what you're doing.
You're the human being, after all, and it is a mere operating system. Maybe
you want to kill off your system.
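The recursive deletion she describes can be sketched in a few lines. A minimal Python illustration of how rm -r walks a tree depth-first, demonstrated safely inside a throwaway temporary directory (the names `nuke` and `doomed` are invented here):

```python
import os
import tempfile

def nuke(path):
    """Depth-first removal, the way rm -r works: delete a directory's
    files first, then its subdirectories, and finally the directory
    itself -- the snake swallowing its tail."""
    for entry in os.listdir(path):
        full = os.path.join(path, entry)
        if os.path.isdir(full):
            nuke(full)          # recurse into the subdirectory
        else:
            os.remove(full)     # delete a plain file
    os.rmdir(path)              # delete the now-empty directory

# Demonstrate in a sandbox, never on a live filesystem root.
doomed = tempfile.mkdtemp()
os.makedirs(os.path.join(doomed, "a", "b"))
open(os.path.join(doomed, "a", "b", "file.txt"), "w").close()
nuke(doomed)
print(os.path.exists(doomed))   # prints False -- the whole tree is gone
```

Run from the root directory as root, the real command has no sandbox; that is the point of the passage.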
But Microsoft was
determined to protect me from myself. Consumer-oriented, idiot-proofed,
covered by its pretty skin of icons and dialog boxes, Windows refused to
let me harm it. I had long ago lost my original start-up disk, the system
was too fritzed to make a new one and now it turned away my subterfuges
of DOS installation diskette, boot disks from other machines, later versions
of utilities. Can't reformat active drive. Wrong version detected.
Setup designed for systems without an operating system; operating system
detected; upgrade version required. A cascade of error messages, warnings,
beeps; a sort of sound and light show -- the Wizard of Oz lighting spectacular
fireworks to keep me from flinging back the curtain to see the short fat
bald man.
For Microsoft's self-protective
skin is really only a show, a lure to the determined engineer, a challenge
to see if you're clever enough to rip the covers off. The more it resisted
me, the more I knew I would enjoy the pleasure of deleting it.
Two hours later, I
was stripping down the system. Layer by layer it fell away. Off came Windows
NT 3.51; off came a wayward co-installation of Windows 95 where it overlaid
DOS. I said goodbye to video and sound; goodbye wallpaper; goodbye fonts
and colors and styles; goodbye windows and icons and menus and buttons
and dialogs. All the lovely graphical skins turned to so much bitwise detritus.
It had the feel of Keir Dullea turning off the keys to HAL's memory core
in the film "2001," each keyturn removing a "higher" function, HAL's voice
all the while descending into mawkish, babyish pleading. Except that I
had the sense that I was performing an exactly opposite process: I was
making my system not dumber but smarter. For now everything on the system
would be something put there by me, and in the end the system itself
would be cleaner, clearer, more knowable -- everything I associate with
the idea of "intelligent."
What I had now was
a bare machine, just the hardware and its built-in logic. No more Microsoft
muddle of operating systems. It was like hosing down your car after washing
it: the same feeling of virtuous exertion, the pleasure of the sparkling
clean machine you've just rubbed all over. Yours. Known down to the crevices.
Then, just to see what would happen, I turned on the computer. It powered
up as usual, gave two long beeps, then put up a message in large letters
on the screen:
NO ROM BASIC
What? Had I somehow
killed off my read-only memory? It doesn't matter that you tell yourself
you're an engineer and game for whatever happens. There is still a moment
of panic when things seem to go horribly wrong. I stared at the message
for a while, then calmed down: It had to be related to not having an operating
system. What else did I think could happen but something weird?
But what something
weird was this exactly? I searched the Net, found hundreds of HOW-TO FAQs
about installing Linux, thousands about uninstalling operating systems
-- endless pages of obscure factoids, strange procedures, good and bad
advice. I followed trails of links that led to interesting bits of information,
currently useless to me. Long trails that ended in dead ends, missing pages,
junk. Then, sometime about 1 in the morning, in a FAQ about Enhanced IDE,
was the answer:
8.1. Why
do I get NO ROM BASIC, SYSTEM HALTED?
This should get a
prize for the PC compatible's most obscure error message. It usually means
you haven't made the primary partition bootable ...
The earliest true-blue
PCs had a BASIC interpreter built in, just like many other home computers
in those days. Even today, the Master Boot Record (MBR) code on your hard
disk jumps to the BASIC ROM if it doesn't find any active partitions. Needless
to say, there's no such thing as a BASIC ROM in today's compatibles....
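The check the FAQ describes -- scan the MBR's partition table for an active, bootable entry -- can be sketched directly. A hedged Python illustration over a synthetic 512-byte MBR; the offsets and the 0x80 boot flag are the classic PC layout, while the sample data is invented:

```python
# Classic PC MBR layout: 446 bytes of boot code, then four 16-byte
# partition entries, then the 0x55AA signature. Byte 0 of each entry
# is the boot flag: 0x80 = active/bootable, 0x00 = inactive.
PART_TABLE_OFFSET = 446
ENTRY_SIZE = 16
NUM_ENTRIES = 4

def find_active_partition(mbr: bytes):
    """Return the index (0-3) of the first active partition, or None --
    the case in which an early BIOS fell back to NO ROM BASIC."""
    assert len(mbr) == 512 and mbr[510:512] == b"\x55\xaa", "not an MBR"
    for i in range(NUM_ENTRIES):
        entry = mbr[PART_TABLE_OFFSET + i * ENTRY_SIZE:][:ENTRY_SIZE]
        if entry[0] == 0x80:
            return i
    return None

# A synthetic MBR with no active partition, then entry 1 marked active.
mbr = bytearray(512)
mbr[510:512] = b"\x55\xaa"
print(find_active_partition(bytes(mbr)))   # prints None
mbr[PART_TABLE_OFFSET + 1 * ENTRY_SIZE] = 0x80
print(find_active_partition(bytes(mbr)))   # prints 1
```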
I had not seen a PC with
built-in BASIC in some 16 years, yet here it still was, vestigial trace
of the interpreter, something still remembering a time when the machine
could be used to interpret and execute my entries as lines in a BASIC program.
The least and smallest thing the machine could do in the absence of all
else, its one last imperative: No operating system! Look for BASIC!
It was like happening upon some primitive survival response, a low-level
bit of hard wiring, like the mysterious built-in knowledge that lets a
blind little mouseling, newborn and helpless, find its way to the teat.
This discovery of
the trace of BASIC was somehow thrilling -- an ancient pot shard found
by mistake in the rubble of an excavation. Now I returned to the FAQs,
lost myself in digging, passed another hour in a delirium of trivia. Hex
loading addresses for devices. Mysteries of the BIOS old and new. Motherboards
certified by the company that had written my BIOS and motherboards that
were not. I learned that my motherboard was an orphan. It was made by a
Taiwanese company no longer in business; its BIOS had been left to languish,
supported by no one. And one moment after midnight on Dec. 31, 1999, it
would reset my system clock to ... 1980? What? Why 1980 and not
zero? Then I remembered: 1980 was the year of the first IBM PC. 1980 was
Year One in desktop time.
The computer was suddenly
revealed as palimpsest. The machine that is everywhere hailed as the very
incarnation of the new had revealed itself to be not so new after all,
but a series of skins, layer on layer, winding around the messy, evolving
idea of the computing machine. Under Windows was DOS; under DOS, BASIC;
and under them both the date of its origins recorded like a birth memory.
Here was the very opposite of the authoritative, all-knowing system with
its pretty screenful of icons. Here was the antidote to Microsoft's many
protections. The mere impulse toward Linux had led me into an act of desktop
archaeology. And down under all those piles of stuff, the secret was written:
We build our computers the way we build our cities -- over time, without
a plan, on top of ruins.
-----------
THE DUMBING-DOWN OF PROGRAMMING | PAGE 2 OF 2
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
My Computer.
This is the face offered to the world by the other machines in the office.
My Computer. I've always hated this icon -- its insulting, infantilizing
tone. Even if you change the name, the damage is done: It's how you've
been encouraged to think of the system. My Computer. My Documents. Baby
names. My world, mine, mine, mine. Network Neighborhood, just like Mister
Rogers'.
On one side of me
was the Linux machine, which I'd managed to get booted from a floppy. It
sat there at a login prompt, plain characters on a black-and-white screen.
On the other side was a Windows NT system, colored little icons on a soothing
green background, a screenful of programming tools: Microsoft Visual C++,
Symantec Visual Cafe, Symantec Visual Page, Totally Hip WebPaint, Sybase
PowerBuilder, Microsoft Access, Microsoft Visual Basic -- tools for everything
from ad hoc Web-page design to corporate development to system engineering.
NT is my development platform, the place where I'm supposed to write serious
code. But sitting between my two machines -- baby-faced NT and no-nonsense
Linux -- I couldn't help thinking about all the layers I had just peeled
off the Linux box, and I began to wonder what the user-friendly NT system
was protecting me from.
Developers
get the benefit of visual layout without the hassle of having to remember
HTML code.
-- Reviewers' guide
to Microsoft J++
Templates, Wizards
and JavaBeans Libraries Make Development Fast
-- Box for Symantec's
Visual Cafe for Java
Simplify application
and applet development with numerous wizards
-- Ad for Borland's
JBuilder in the Programmer's Paradise catalog
Thanks to IntelliSense,
the Table Wizard designs the structure of your business and personal databases
for you.
-- Box for Microsoft
Access
Developers will benefit
by being able to create DHTML components without having to manually code,
or even learn, the markup language.
-- Review of J++
6.0 in PC Week, March 16, 1998.
Has custom controls
for all the major Internet protocols (Windows Sockets, FTP, Telnet, Firewall,
Socks 5.0, SMTP, POP, MIME, NNTP, Rcommands, HTTP, etc.). And you know
what? You really don't need to understand any of them to include the functionality
they offer in your program.
-- Ad for Visual
Internet Toolkit from the Distinct Corp. in the Components Paradise catalog
My programming tools
were full of wizards. Little dialog boxes waiting for me to click "Next"
and "Next" and "Finish." Click and drag and shazzam! -- thousands of lines
of working code. No need to get into the "hassle" of remembering the language.
No need to even learn it. It is a powerful siren-song lure: You
can make your program do all these wonderful and complicated things, and
you don't really need to understand.
In six clicks of a
wizard, the Microsoft C++ AppWizard steps me through the creation of an
application skeleton. The application will have a multidocument interface,
database support from SQL Server, OLE compound document support as both
server and container, docking toolbars, a status line, printer and print-preview
dialogs, 3-D controls, messaging API and Windows sockets support; and,
when my clicks are complete, it will immediately compile, build and execute.
Up pops a parent and child window, already furnished with window controls,
default menus, icons and dialogs for printing, finding, cutting and pasting,
saving and so forth. The process takes three minutes.
Of course, I could
look at the code that the Wizard has generated. Of course, I could read
carefully through the 36 generated C++ class definitions. Ideally, I would
not only read the code but also understand all the calls on the operating
system and all the references to the library of standard Windows objects
called the Microsoft Foundation Classes. Most of all, I would study them
until I knew in great detail the complexities of servers and containers,
OLE objects, interaction with relational databases, connections to a remote
data source and the intricacies of messaging -- all the functionality AppWizard
has just slurped into my program, none of it trivial.
But everything in
the environment urges me not to. What the tool encourages me to do now
is find the TODO comments in the generated code, then do a little filling
in -- constructors and initializations. Then I am to start clicking and
dragging controls onto the generated windows -- all the prefabricated text
boxes and list boxes and combo boxes and whatnot. Then I will write a little
code that hangs off each control.
In this programming
world, the writing of my code has moved away from being the central task
to become a set of appendages to the entire Microsoft system structure.
I'm a scrivener here, a filler-in of forms, a setter of properties. Why
study all that other stuff, since it already works anyway? Since my deadline
is pressing. Since the marketplace is not interested in programs that do
not work well in the entire Microsoft structure, which AppWizard has so
conveniently prebuilt for me.
This not-knowing is
a seduction. I feel myself drifting up, away from the core of what I've
known programming to be: text that talks to the system and its other software,
talk that depends on knowing the system as deeply as possible. These icons
and wizards, these prebuilt components that look like little pictures,
are obscuring the view that what lies under all these cascading windows
is only text talking to machine, and underneath it all is something still
looking for a BASIC interpreter. But the view the wizards offer is pleasant
and easy. The temptation never to know what underlies that ease is overwhelming.
It is like the relaxing passivity of television, the calming blankness
when a theater goes dark: It is the sweet allure of using.
My programming tools
have become like My Computer. The same impulse that went into the
Windows 95 user interface -- the desire to encapsulate complexity behind
a simplified set of visual representations, the desire to make me resist
opening that capsule -- is now in the tools I use to write programs for
the system. What started out as the annoying, cloying face of a consumer-oriented
system for a naive user has somehow found its way into C++. Dumbing-down
is trickling down. Not content with infantilizing the end user, the purveyors
of point-and-click seem determined to infantilize the programmer as well.
But what if you're
an experienced engineer? What if you've already learned the technology
contained in the tool, and you're ready to stop worrying about it? Maybe
letting the wizard do the work isn't a loss of knowledge but simply a form
of storage: the tool as convenient information repository.
(To be continued.)
SALON | May 12,
1998
- - - - - - - - -
- - -
Go on to Part Two
of "The
Dumbing Down of Programming," where Ellen Ullman explores why wizards
aren't merely a helpful convenience -- and how, when programmers come to
rely too much upon "easy" tools, knowledge can disappear into code.
- - - - - - - - -
- - -
Ellen Ullman is a software engineer. She is the author of "Close to the Machine: Technophilia and its Discontents."
The dumbing-down of programming
PART TWO: RETURNING TO THE SOURCE. ONCE KNOWLEDGE DISAPPEARS INTO CODE, HOW DO WE RETRIEVE IT?
BY ELLEN ULLMAN
I used to
pass by a large computer system with the feeling that it represented the
summed-up knowledge of human beings. It reassured me to think of all those
programs as a kind of library in which our understanding of the world was
recorded in intricate and exquisite detail. I managed to hold onto this
comforting belief even in the face of 20 years in the programming business,
where I learned from the beginning what a hard time we programmers have
in maintaining our own code, let alone understanding programs written and
modified over years by untold numbers of other programmers. Programmers
come and go; the core group that once understood the issues has written
its code and moved on; new programmers have come, left their bit of understanding
in the code and moved on in turn. Eventually, no one individual or group
knows the full range of the problem behind the program, the solutions we
chose, the ones we rejected and why.
Over time, the only
representation of the original knowledge becomes the code itself, which
by now is something we can run but not exactly understand. It has become
a process, something we can operate but no longer rethink deeply. Even
if you have the source code in front of you, there are limits to what a
human reader can absorb from thousands of lines of text designed primarily
to function, not to convey meaning. When knowledge passes into code, it
changes state; like water turned to ice, it becomes a new thing, with new
properties. We use it; but in a human sense we no longer know
it.
The Year 2000 problem
is an example on a vast scale of knowledge disappearing into code. And
the soon-to-fail national air-traffic control system is but one stark instance
of how computerized expertise can be lost. In March, the New York Times
reported that IBM had told the Federal Aviation Administration that, come
the millennium, the existing system would stop functioning reliably. IBM's
advice was to completely replace the system because, they said, there was
"no one left who understands the inner workings of the host computer."
No one left who
understands. Air-traffic control systems, bookkeeping, drafting, circuit
design, spelling, differential equations, assembly lines, ordering systems,
network object communications, rocket launchers, atom-bomb silos, electric
generators, operating systems, fuel injectors, CAT scans, air conditioners
-- an exploding list of subjects, objects and processes rushing into code,
which eventually will be left running without anyone left who understands
them. A world full of things like mainframe computers, which we can use
or throw away, with little choice in between. A world floating atop a sea
of programs we've come to rely on but no longer truly understand or control.
Code and forget; code and forget: programming as a collective exercise
in incremental forgetting.
Every visual programming
tool, every wizard, says to the programmer: No need for you to know this.
What reassures the programmer -- what lulls an otherwise intelligent, knowledge-seeking
individual into giving up the desire to know -- is the suggestion that
the wizard is only taking care of things that are repetitive or boring.
These are only tedious and mundane tasks, says the wizard, from which I
will free you for better things. Why reinvent the wheel? Why should anyone
ever again write code to put up a window or a menu? Use me and you will
be more productive.
Productivity has always
been the justification for the prepackaging of programming knowledge. But
it is worth asking about the sort of productivity gains that come from
the simplifications of click-and-drag. I once worked on a project in which
a software product originally written for UNIX was being redesigned and
implemented on Windows NT. Most of the programming team consisted of programmers
who had great facility with Windows, Microsoft Visual C++ and the Foundation
Classes. In no time at all, it seemed, they had generated many screenfuls
of windows and toolbars and dialogs, all with connections to networks and
data sources, thousands and thousands of lines of code. But when the inevitable
difficulties of debugging came, they seemed at sea. In the face of the
usual weird and unexplainable outcomes, they stood a bit agog. It was left
to the UNIX-trained programmers to fix things. The UNIX team members were
accustomed to having to know. Their view of programming as language-as-text
gave them the patience to look slowly through the code. In the end, the
overall "productivity" of the system, the fact that it came into being
at all, was the handiwork not of tools that sought to make programming
seem easy, but the work of engineers who had no fear of "hard."
And as prebuilt components
accomplish larger and larger tasks, it is no longer only a question of
putting up a window or a text box, but of an entire technical viewpoint
encapsulated in a tool or component. No matter if, like Microsoft's definition
of a software object, that viewpoint is haphazardly designed, verbose,
buggy. The tool makes it look clean; the wizard hides bad engineering as
well as complexity.
In the pretty, visual
programming world, both the vendor and programmer can get lazy. The vendor
doesn't have to work as hard at producing and committing itself to well-designed
programming interfaces. And the programmer can stop thinking about the
fundamentals of the system. We programmers can lie back and inherit the
vendor's assumptions. We accept the structure of the universe implicit
in the tool. We become dependent on the vendor. We let knowledge about
difficulty and complexity come to reside not in us, but in the program
we use to write programs.
No wizard can possibly
banish all the difficulties, of course. Programming is still a tinkery
art. The technical environment has become very complex -- we expect bits
of programs running anywhere to communicate with bits of programs running
anywhere else -- and it is impossible for any one individual to have deep
and detailed knowledge about every niche. So a certain degree of specialization
has always been needed. A certain amount of complexity-hiding is useful
and inevitable.
Yet, when we allow
complexity to be hidden and handled for us, we should at least notice what
we're giving up. We risk becoming users of components, handlers of black
boxes that don't open or don't seem worth opening. We risk becoming like
auto mechanics: people who can't really fix things, who can only swap components.
It's possible to let technology absorb what we know and then re-express
it in intricate mechanisms -- parts and circuit boards and software objects
-- mechanisms we can use but do not understand in crucial ways. This not-knowing
is fine while everything works as we expected. But when something breaks
or goes wrong or needs fundamental change, what will we do but stand a
bit helpless in the face of our own creations?
- - - - - - - - - - - -
THE DUMBING-DOWN OF PROGRAMMING | PAGE 2 OF 2
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
Linux won't
recognize my CD-ROM drive. I'm using what should be the right boot kernel,
it's supposed to handle CD-ROMs like mine, but no: The operating system
doesn't see anything at all on /dev/hdc. I try various arcane commands
to the boot loader: still nothing. Finally I'm driven back to the HOW-TO
FAQs and realize I should have started there. In just a few minutes, I
find a FAQ that describes my problem in thorough and knowledgeable detail.
Don't let anyone ever say that Linux is an unsupported operating system.
Out there is a global militia of fearless engineers posting helpful information
on the Internet: Linux is the best supported operating system in the world.
The problem is the
way the CD-ROM is wired, and as I reach for the screwdriver and take the
cover off the machine, I realize that this is exactly what I came for:
to take off the covers. And this, I think, is what is driving so many engineers
to Linux: to get their hands on the system again.
Now that I know that
the CD-ROM drive should be attached as a master device on the secondary
IDE connector of my orphaned motherboard -- now that I know this machine
to the metal -- it occurs to me that Linux is a reaction to Microsoft's
consumerization of the computer, to its cutesying and dumbing-down and
bulletproofing behind dialog boxes. That Linux represents a desire to get
back to UNIX before it was Hewlett-Packard's HP-UX or Sun's Solaris or
IBM's AIX -- knowledge now owned by a corporation, released in unreadable
binary form, so easy to install, so hard to uninstall. That this sudden
movement to freeware and open source is our desire to revisit the idea
that a professional engineer can and should be able to do the one thing
that is most basic to our work: examine the source code, the actual program,
the real and unvarnished representation of the system. I exaggerate only
a little if I say that it is a reassertion of our dignity as humans working
with mere machine; a return, quite literally, to the source.
In an ideal world,
I would not have to choose between the extreme polarities of dialog box
and source code. My dream system interface would allow me to start hesitantly,
unschooled. Then, as I used the facility that distinguishes me from the
machine -- the still-mysterious capacity to learn, the ability to do something
the second time in a way quite different from the first -- I could descend
a level to a smarter, quicker kind of "talk." I would want the interface
to scale with me, to follow me as my interest deepened or waned. Down,
I would say, and it would let me get my way, however stupid or incomprehensible
this seemed to it, a mere program. Up, I could say, so I could try
something new or forgotten or lost just now in a moment of my being human,
nonlinear, unpredictable.
Once my installation
of Linux was working, I felt myself qualified, as a bona fide Linux user,
to attend a meeting of the Silicon Valley Linux User's Group. Linus Torvalds,
author of the Linux kernel and local godhead, was scheduled to speak. The
meeting was to be in a building in the sprawling campus of Cisco Systems.
I was early; I took a seat in a nearly empty room that held exactly 200
chairs. By the time Torvalds arrived half an hour later, more than twice
that many people had crowded in.
Torvalds is a witty
and engaging speaker, but it was not his clever jokes that held the audience;
he did not cheerlead or sell or sloganize. What he did was a sort of engineering
design review. Immediately he made it clear that he wanted to talk about
the problem he was just then working on: a symmetrical multiprocessing
kernel for Linux. For an hour and a half, the audience was rapt as he outlined
the trade-offs that go into writing an operating system that runs on multiple
processors: better isolation between processes vs. performance; how many
locks would be a good number, not so many as to degrade response, not so
few as to risk one program stepping on the memory area of another; what speed
of processor should you test on, since faster processors would tend to
minimize lock contention; and so on through the many countervailing and
contradictory demands on the operating system, all valid, no one solution
addressing all.
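The lock-granularity trade-off Torvalds outlined can be miniaturized in user space. A toy Python sketch, not kernel code: one big lock serializes every thread, while per-resource locks let unrelated work proceed without contention, at the cost of managing more locks. All names here are invented for illustration:

```python
import threading

counters = {"net": 0, "disk": 0}

# Coarse-grained: one lock guards everything, like a "big kernel lock".
big_lock = threading.Lock()

def bump_coarse(name, times):
    for _ in range(times):
        with big_lock:              # every thread contends on this one lock
            counters[name] += 1

# Fine-grained: one lock per resource; unrelated updates never contend.
fine_locks = {name: threading.Lock() for name in counters}

def bump_fine(name, times):
    for _ in range(times):
        with fine_locks[name]:      # only same-resource threads contend
            counters[name] += 1

threads = [threading.Thread(target=bump_fine, args=(n, 10000))
           for n in counters for _ in range(2)]
for t in threads: t.start()
for t in threads: t.join()
print(counters)   # prints {'net': 20000, 'disk': 20000}
```

Both schemes yield correct counts; the difference the kernel cares about is how often threads stall waiting on a lock they did not need, which is why the right number of locks is a design judgment rather than a formula.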
An immense calm settled
over the room. We were reminded that software engineering was not about
right and wrong but only better and worse, solutions that solved some problems
while ignoring or exacerbating others. That the machine that all the world
seems to want to see as possessing some supreme power and intelligence
was indeed intelligent, but only as we humans are: full of hedge and error,
brilliance and backtrack and compromise. That we, each of us, could participate
in this collaborative endeavor of creating the machine, to the extent we
could, and to the extent we wished.
The next month, the
speaker at the Silicon Valley Linux User's Group is Marc Andreessen, founder
of Netscape. The day before, the source code for Netscape's browser had
been released on the Internet, and Andreessen is here as part of the general
celebration. The mood tonight is not cerebral. Andreessen is expansive,
talks about the release of the source code as "a return to our roots on
a personal level." Tom Paquin, manager of Mozilla, the organization created
to manage the Netscape source code, is unabashed in his belief that free
and open source can compete with the juggernaut Microsoft, with the giants
Oracle and Sun. He almost seems to believe that Netscape's release of the
source isn't an act of desperation against the onslaught of the Microsoft
browser. "Technologists drive this industry," he says, whistling in the
dark. "The conventional wisdom is it's all marketing, but it's not."
Outside, a bus is
waiting to take the attendees up to San Francisco, where a big party is
being held in a South of Market disco joint called the Sound Factory. There
is a long line outside, backed up almost to the roadway of the Bay Bridge.
Andreessen enters, and he is followed around by lights and cameras like
a rock star. In all this celebration, for just this one night, it's almost
possible to believe that technologists do indeed matter to technology,
that marketing is not all, and all we have to do is get the code to the
people who might understand it and we can reclaim our technical souls.
Meanwhile, Andreessen
disappears into a crush of people, lights flash, a band plays loudly and
engineers, mostly men, stand around holding beer bottles. Above us, projected
onto a screen that is mostly ignored, is what looks like the Netscape browser
source code. The red-blue-green guns on the color projector are not well
focused. The code is too blurry, scrolling by too quickly, to be read.
SALON | May 13,
1998
- - - - - - - - - - - - - - - - - - - - - - - -
TABLE TALK
Is programming being
"dumbed-down"? Are "easy" programming tools and wizards changing the nature
of programming? Is knowledge disappearing into code? Come to Table Talk's
Digital
Culture area and talk about Ellen Ullman's "The Dumbing-Down of Programming."
- - - - - - - - - - - - - - - - - - - - - - - -
RELATED SALON STORIES
Disappearing
into the code: A deadline brings programmers to the place of no shame.
Excerpt from "Close to the Machine."
By Ellen Ullman
Oct. 9, 1997
Sliced
off by the cutting edge: It's impossible for programmers to keep up
with every trend even when they're eager and willing. What happens when
they despair? Excerpt from "Close to the Machine."
By Ellen Ullman
Oct. 16, 1997
Elegance
and entropy: An interview with Ellen Ullman.
By Scott Rosenberg
Oct. 9, 1997