Bill Jonas on Thu, 22 Jun 2000 14:33:30 -0400 (EDT)



Re: [PLUG] Video Adapter Incompatibility


On Thu, 22 Jun 2000, Jeff Abrahamson wrote:

>  - XF86 3.x does not have support for graphics card acceleration.
>    XF86 4.x will but isn't finished yet.

XF86 3.x does indeed have support for hardware acceleration (well, in
reality, it's all hardware: "hardware" here means the accelerator on
the graphics card, and "software" means the CPU).  The difference
between 3.x and 4.x is that 4.x takes a more modular approach:
instead of having many different X servers (each of which is, in
effect, a video driver), you have just a handful, with module-like,
chipset-specific pieces of code that plug in.

The issue with 4.0 right now is that it's a radically different
architecture from 3.3.x and earlier, so there are naturally more bugs
to work out, whereas the old model is relatively mature.
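
To make the modular idea concrete, here's a rough sketch in C of what
"module-like pieces of code that plug in" amounts to -- purely
illustrative, mind you, not the actual XFree86 4.x driver interface:

    /* Illustrative only -- NOT the real XFree86 4.x driver API.
     * 3.x shipped a whole X server per chipset family; 4.x ships
     * one server that loads a chipset-specific module and calls
     * it through a common table of entry points. */
    typedef struct {
        const char *chipset;                 /* which card this drives */
        int  (*init)(void);                  /* probe and set up card  */
        void (*draw_line)(int x0, int y0, int x1, int y1);
        void (*fill_rect)(int x, int y, int w, int h);
    } video_driver;

    /* The core server dlopen()s the right module, gets one of
     * these tables back, and needs no chipset knowledge itself. */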

>  - Somehow, whether cached on the graphics card or in main memory
>    manipulated by the CPU, there's a bitmap of the display. This
>    bitmap is called the frame buffer.

Yes, generally, although in this context "framebuffer" usually means
the area of main memory in which the CPU manipulates the display, as
opposed to accelerator/card-based "hardware" acceleration (in quotes
because, as noted above, "software" acceleration is done by the CPU,
which is most definitely a piece of hardware <g>).
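
To put it in concrete terms: pixel (x, y) lives at a fixed offset in
one linear chunk of memory, whether that chunk is VRAM on the card or
a set-aside region of main RAM.  A tiny sketch (names are mine,
assuming a 32-bit-per-pixel mode):

    #include <stdint.h>

    /* pitch = bytes per scanline; 4 = bytes per 32-bit pixel.
     * "fb" points at the start of the bitmap, wherever it lives. */
    static inline void set_pixel(uint8_t *fb, int pitch,
                                 int x, int y, uint32_t color)
    {
        *(uint32_t *)(fb + y * pitch + x * 4) = color;
    }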

>  - In the case of hardware acceleration, the display server (X) says
>    something like "draw a line from here to here." The card updates
>    its bitmap accordingly. Without hardware acceleration, X has to
>    implement Bresenham (sp?) on its own to draw that line. And so on
>    for other graphics operations.

Pretty much (I'll take your word on what it's called <g>); with hardware
acceleration, the graphics card does the bulk of the work.  In the
absence of it, the CPU has to do *all* the work.
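
(And for the record, "Bresenham" is indeed how it's spelled.  The
algorithm is simple enough to sketch; this is the classic all-integer
version, with put_pixel() as a hypothetical stand-in for "set one dot
in the framebuffer".  It's exactly the kind of loop the X server ends
up running on the CPU when the card can't draw the line itself.)

    #include <stdlib.h>

    void put_pixel(int x, int y);   /* hypothetical: set one dot */

    /* Bresenham's line algorithm: walk from (x0,y0) to (x1,y1),
     * letting an integer error term decide when to step on each
     * axis -- no floating point needed. */
    void draw_line(int x0, int y0, int x1, int y1)
    {
        int dx = abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
        int dy = -abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
        int err = dx + dy, e2;

        for (;;) {
            put_pixel(x0, y0);
            if (x0 == x1 && y0 == y1)
                break;
            e2 = 2 * err;
            if (e2 >= dy) { err += dy; x0 += sx; }   /* step in x */
            if (e2 <= dx) { err += dx; y0 += sy; }   /* step in y */
        }
    }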

>So I've been confused by the above thread's reference to "framebuffer
>support." What does it mean to have a display without a backing
>bitmap? Is a frame buffer something different?

Basically, in this context (peecee-land), "framebuffer" refers to an
abstraction of the graphics process.  You see, with graphics acceleration
on the PC, there is a graphics co-processor on the card, which has its
own memory (VRAM or DRAM or whatnot).  In the absence of this, a portion
of main memory is set aside for the CPU to do all the graphics work,
line drawing and the like.  The advantage of this, of course, is that
you then write one driver/server and through some black magic and
hand-waving <g> it auto-magically works with nearly every card.  (The
alternative, to take your example, is to write servers that know how to
speak each chipset's different language, so that when X tells the card
to draw a line from point A to point B, the card knows what the X server
is saying.)  You can also get some nifty resolutions in text mode, like
1024x768, *and* you can have a spiffy li'l penguin logo on bootup.  :)
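
You can poke at this abstraction yourself, by the way.  A minimal
sketch, assuming the framebuffer device is compiled into your kernel
and shows up as /dev/fb0, and assuming a 32-bit-per-pixel mode (error
handling pared down):

    #include <fcntl.h>
    #include <linux/fb.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <sys/ioctl.h>
    #include <sys/mman.h>
    #include <unistd.h>

    int main(void)
    {
        struct fb_var_screeninfo var;  /* resolution, bits per pixel */
        struct fb_fix_screeninfo fix;  /* scanline length, VRAM size */
        int fd = open("/dev/fb0", O_RDWR);

        if (fd < 0) { perror("open /dev/fb0"); return 1; }
        ioctl(fd, FBIOGET_VSCREENINFO, &var);
        ioctl(fd, FBIOGET_FSCREENINFO, &fix);

        uint8_t *fb = mmap(NULL, fix.smem_len, PROT_READ | PROT_WRITE,
                           MAP_SHARED, fd, 0);
        if (fb == MAP_FAILED) { perror("mmap"); return 1; }

        /* Draw a white diagonal: the CPU writes straight into the
         * memory the card scans out -- no accelerator involved. */
        for (unsigned i = 0; i < 100 && i < var.xres && i < var.yres; i++)
            *(uint32_t *)(fb + i * fix.line_length + i * 4) = 0xffffffff;

        munmap(fb, fix.smem_len);
        close(fd);
        return 0;
    }

Run it from a text console rather than from under X, since X thinks
it owns the display.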

The disadvantage, obviously, is that the CPU has to do all the work, and
it becomes quite slow at anything approaching the resolutions and color
depths that most people use these days.  (IOW, speaking from experience,
it ain't all that fun on a Cyrix MII-333.)  The PC framebuffer (you can
see it in the kernel config if you tell the config program that you want
to see experimental code) is modeled after the code that was written for
SPARCs and other non-PC architectures.  There's also a
Framebuffer-[mini-]HOWTO at linuxdoc.org that tells you how to set it
up; it probably contains better information than what I've provided,
and it's fairly recent.

>I'd be grateful if someone could clarify for me what reality and/or
>the thread is actually about?

It's basically about the kernel's new support, on the x86
architecture, for a software-driven graphics abstraction: the kernel
(and by extension, your main CPU) does the work of a video card, and
the video card becomes little more than a pipeline to the monitor.

Now, I'm no graphics expert by any means, but as I alluded to earlier,
I've had to use this feature (thanks to a motherboard-integrated
graphics accelerator that didn't have quite the chipset it claimed to).
And yes, it is still experimental; Netscape would crash whenever it
started Java, before I got my 3Dfx.  :)  If there's something I got
wrong, then by all means, let somebody who knows more about it than I
do step in and correct me.

HTH,
Bill
-- 
>Ever heard of .cshrc?             | "Linux means never having to delete
That's a city in Bosnia. Right?    |  your love mail." -- Don Marti
(Discussion in comp.os.linux.misc  |  http://www.netaxs.com/~bj/
on the intuitiveness of commands.) |  http://www.harrybrowne.org/


______________________________________________________________________
Philadelphia Linux Users Group       -       http://plug.nothinbut.net
Announcements - http://lists.nothinbut.net/mail/listinfo/plug-announce
General Discussion   -   http://lists.nothinbut.net/mail/listinfo/plug