| Sysop: | Amessyroom |
|---|---|
| Location: | Fayetteville, NC |
| Users: | 43 |
| Nodes: | 6 (0 / 6) |
| Uptime: | 104:22:00 |
| Calls: | 290 |
| Files: | 905 |
| Messages: | 76,612 |
According to Anton Ertl <anton@mips.complang.tuwien.ac.at>:
antispam@fricas.org (Waldek Hebisch) writes:
From my point of view main drawbacks of 286 is poor support for
large arrays and problem for Lisp-like system which have a lot
of small data structures and traverse then via pointers.
Yes. In the first case the segments are too small, in the latter case
there are too few segments (if you have one segment per object).
Intel clearly had some strong opinions about how people would program
the 286, which turned out to bear almost no relation to the way we
actually wanted to program it.
Some of the stuff they did was just perverse, like putting flag
bits in the low part of the segment number rather than the high
bits. If you had objects bigger than 64K, you had to shift
the segment number three bits to the left when computing
addresses.
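Concretely, the shift Levine describes falls out of the 286 selector
layout. A minimal C sketch (the helper names are mine, but the bit
positions are the architected ones):

```c
#include <stdint.h>

/* 80286 protected-mode selector layout:
 *   bits 0-1 : RPL (requested privilege level)
 *   bit  2   : TI  (table indicator, 0 = GDT, 1 = LDT)
 *   bits 3-15: descriptor index (13 bits, so at most 8192 descriptors
 *              per table, which is why "one segment per object" runs
 *              out so quickly for Lisp-like systems)
 *
 * Because the index lives in the HIGH bits, stepping to the next 64K
 * piece of a large object means adding 1 << 3 to the selector, i.e.
 * shifting the segment count left by three bits. */

#define SEL_RPL(sel)    ((uint16_t)((sel) & 0x3))
#define SEL_TI(sel)     ((uint16_t)(((sel) >> 2) & 0x1))
#define SEL_INDEX(sel)  ((uint16_t)((sel) >> 3))

/* Advance a selector by n consecutive 64K segments of a huge object. */
static uint16_t huge_advance(uint16_t sel, unsigned n)
{
    return (uint16_t)(sel + (n << 3));  /* the three-bit shift */
}
```

For example, starting from selector 0x000B (index 1, TI 0, RPL 3), the
next 64K chunk of the same object lives at selector 0x0013.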
They also apparently didn't expect people to switch segments much.
If you loaded a segment register with the value it already contained,
it still fetched all of the stuff from memory.
How many gates would it have taken to check for the same value and
bypass the loads? If they had done that, we could have used large
model calls everywhere since long and short calls would be about the
same speed, and not had to screw around deciding what was a long call
and what was short and writing glue code to allow both kinds.
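The bypass being asked for can be sketched in C. This is a what-if
model, not how the 286 behaved (the real chip always refetched the
descriptor), and all the names here are hypothetical:

```c
#include <stdint.h>

/* A segment-register load that compares the new selector against the
 * current one and skips the descriptor-table fetch when they match. */

typedef struct {
    uint32_t base;      /* 24-bit segment base on the 286 */
    uint16_t limit;
    uint8_t  access;
} descriptor_t;

typedef struct {
    uint16_t     selector;  /* visible part of the segment register */
    descriptor_t cache;     /* hidden descriptor cache */
    unsigned     fetches;   /* count of table fetches, for illustration */
} seg_reg_t;

static void load_segment(seg_reg_t *sr, uint16_t sel,
                         const descriptor_t *table)
{
    if (sel == sr->selector)
        return;                    /* same value: bypass the reload */
    sr->selector = sel;
    sr->cache = table[sel >> 3];   /* fetch descriptor from table */
    sr->fetches++;
}
```

With that check, a long call that reloads CS with the selector it
already holds would cost about the same as a short call, which is the
point of the complaint.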
In article <6d5fa21e63e14491948ffb6a9d08485a@www.novabbs.org>,
mitchalsup@aol.com (MitchAlsup1) wrote:
On Sun, 5 Jan 2025 2:56:08 +0000, John Levine wrote:
They also apparently didn't expect people to switch segments much.
Clearly. They expected segments to be essentially stagnant--unlike
the people trying to do things with x86s...
An idea: The target markets for the 8080 and 8085 were clearly embedded
systems. The Z80 and 6502 rapidly became popular in the micro-computer
market, but the 808[05] did not.
Intel may still have been thinking in
terms of embedded systems when designing the 80286.
The IBM PC was launched in August 1981 and around a year passed before it
became clear that this machine was having a huge and lasting effect on
the market. The 80286 was released on February 1st 1982, although it
wasn't used much in PCs until the IBM PC/AT in August 1984.
The 80386 sampled in 1985 and was mass-produced in 1986. That would seem
to have been the first version of x86 where it was obvious at the start
of design that use in general-purpose computers would be important.
jgd@cix.co.uk (John Dallman) writes:
An idea: The target markets for the 8080 and 8085 were clearly embedded
systems. The Z80 and 6502 rapidly became popular in the micro-computer
market, but the 808[05] did not.
The 8080 was used in the first microcomputers, e.g., the 1974 Altair
8800 and the IMSAI 8080. It was important for all the CP/M machines,
because the CP/M software (both the OS and the programs running on it)
were written to use the 8080 instruction set, not the Z80 instruction
set. And CP/M was the professional microcomputer OS before the IBM PC
compatible market took off, despite the fact that the most popular
microcomputers of the time (such as the Apple II, TRS-80, and PET) did
not use it; there was an add-on card for the Apple II with a Z80 for
running CP/M, though, which shows the importance of CP/M.
Anyway, while Zilog may have taken their sales, I very much believe
that Intel was aware of the general-purpose computing market, and the
iAPX432 clearly showed that they wanted to be dominant there. It's an
irony of history that the 8086/8088 actually went where the action
was.
Intel released the MCS-51 (aka 8051) in 1980 for embedded systems, and
it's very successful there, and before that came the MCS-48 (8048) in
1976.
Intel may still have been thinking in
terms of embedded systems when designing the 80286.
I very much doubt that the segments and the 24 address bits were
designed for embedded systems. The segments look more like an echo of
the iAPX432 than of anything designed for embedded systems.
The idea of some static allocation of memory for which segments might
work may come from old mainframe systems, where programs were (in my
impression) more static than PC programs and modern computing. Even
Unix programs, which were more dynamic than mainframe programs, had
quite a bit of static allocation in the early days; this is reflected
in the paragraph in the GNU coding standards:
|Avoid arbitrary limits on the length or number of any data structure,
|including file names, lines, files, and symbols, by allocating all
|data structures dynamically. In most Unix utilities, "long lines are
|silently truncated". This is not acceptable in a GNU utility.
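The rule in that paragraph is easy to illustrate. A minimal sketch
using POSIX getline(), which grows its buffer to fit any line
(read_full_line is a hypothetical wrapper name, not from any of the
posts):

```c
#define _POSIX_C_SOURCE 200809L  /* for getline() */
#include <stdio.h>
#include <stdlib.h>

/* Read one line of arbitrary length; getline() reallocates the buffer
 * as needed, so nothing is ever truncated. Caller frees the result;
 * returns NULL on EOF or error. */
static char *read_full_line(FILE *fp)
{
    char   *line = NULL;
    size_t  cap  = 0;

    if (getline(&line, &cap, fp) == -1) {
        free(line);
        return NULL;
    }
    return line;
}
```

Contrast with the classic fgets(buf, 80, fp) pattern, where characters
81 and up of a long line are silently dropped, which is exactly the
behavior the GNU standard forbids.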
The IBM PC was launched in August 1981 and around a year passed before it
became clear that this machine was having a huge and lasting effect on
the market. The 80286 was released on February 1st 1982, although it
wasn't used much in PCs until the IBM PC/AT in August 1984.
The 80286 project was started in 1978, before any use of the 8086.
<https://timeline.intel.com/1978/kicking-off-the-80286> claims that
they "spent six months on field research into customers' needs alone".
Judging by the results, maybe the customers were clueless, or maybe
Intel asked the wrong questions.
The 80386 sampled in 1985 and was mass-produced in 1986. That would seem
to have been the first version of x86 where it was obvious at the start
of design that use in general-purpose computers would be important.
Actually, reading the oral history of the 386, at the start the 386
project was just an unimportant follow-on of the 286, while the main
action was expected to be on the BiiN project (from which the i960
came). Only sometime during that project did the IBM PC market explode
and the 386 become the most important project of the company.
But yes, they were very much aware of the needs of programmers in the
386 project, and would probably have done something with just paging
and no segments if they had not had the 8086 and 80286 legacy.
- anton
According to Anton Ertl <anton@mips.complang.tuwien.ac.at>:
Anyway, while Zilog may have taken their sales, I very much believe
that Intel was aware of the general-purpose computing market, and the
iAPX432 clearly showed that they wanted to be dominant there. It's an
irony of history that the 8086/8088 actually went where the action
was.
I have heard that the IBM PC was originally designed with a Z80, and
fairly late in the process someone decided (not unreasonably) that it
wouldn't be different enough from all the other Z80 boxes to be an
interesting product. They wanted a 16 bit processor but for time and
money reasons they stayed with the 8 bit bus they already had. The
options were 68008 and 8088. Moto was only shipping samples of the
68008 while Intel could provide 8088 in quantity, so they went with
the 8088.
If Moto had been a little farther along, the history of the PC industry
could have been quite different.
On Sun, 5 Jan 2025 20:01:25 +0000, John Levine wrote:
According to Anton Ertl <anton@mips.complang.tuwien.ac.at>:
Anyway, while Zilog may have taken their sales, I very much believe
that Intel was aware of the general-purpose computing market, and the
iAPX432 clearly showed that they wanted to be dominant there. It's an
irony of history that the 8086/8088 actually went where the action
was.
I have heard that the IBM PC was originally designed with a Z80, and
fairly late in the process someone decided (not unreasonably) that it
wouldn't be different enough from all the other Z80 boxes to be an
interesting product. They wanted a 16 bit processor but for time and
money reasons they stayed with the 8 bit bus they already had. The
options were 68008 and 8088. Moto was only shipping samples of the
68008 while Intel could provide 8088 in quantity, so they went with
the 8088.
If Moto had been a little farther along, the history of the PC industry
could have been quite different.
If Moto had done the 68008 first, it might very well have turned out
differently.
I do believe that IBM did seriously consider the risk of making the
PC too good, so that it would compete directly with their low-end
systems (8100?).
In article <vlervh$174vb$1@dont-email.me>, terje.mathisen@tmsw.no (Terje Mathisen) wrote:
I do believe that IBM did seriously consider the risk of making the
PC too good, so that it would compete directly with their low-end
systems (8100?).
I recall reading back in the 1980s that the PC was intended to be
incapable of competing with the System/36 minis, and the previous
System/34 and /32 machines. It rather failed at that.
John
On Thu, 1 Jan 1970 0:00:00 +0000, John Dallman wrote:
In article <vlervh$174vb$1@dont-email.me>, terje.mathisen@tmsw.no
(Terje Mathisen) wrote:
I do believe that IBM did seriously consider the risk of making the
PC too good, so that it would compete directly with their low-end
systems (8100?).
I recall reading back in the 1980s that the PC was intended to be
incapable of competing with the System/36 minis, and the previous
System/34 and /32 machines. It rather failed at that.
Perhaps IBM should have made them more performant!?!