Ok, so where are the SMP Spec results? How have the compilers and operating system been enhanced to take advantage of SMP? What applications (other than fricking Photoshop) will be available that can take advantage of this second processor? How exactly is a stupid cooperatively multitasking operating system supposed to use a completely asynchronous resource like a whole other CPU? Where the *F* is OSX?!?! Did Apple work with any Linux vendors to make sure they could take advantage of the second processor?
Update: Just as a point of clarification, the "dual G4's" are not capable of "SMP" using OS 9. It is strictly a master-slave arrangement (much like a graphics card plays a subservient role to the main CPU). Although OS X beta has been released, (SMP based) benchmark results are not forthcoming (big surprise ...)
Tech-Report has a nice little reaction to the whole thing.
Check it out at their site
The page also shows estimated scores for the Motorola G4-500 as being basically identical to the G3. This makes perfect sense, since the library does not use AltiVec, MMX, SSE, or 3DNow!
This benchmark tests raw integer ALU speed, roughly in "operations per second" normalized by the theoretical performance of an Alpha 21264-750Mhz (it should be noted that the real Alpha tests do not show scores of 100.)
Update: (12/21/00) there are new scores with actual runs for the G3 (as opposed to projections):
The G4 did post some gains over the previous version of the benchmark (probably due to over-projections of the Alpha results), but both x86 processors gained even more. I mean, holy cow, look at that Athlon! It's basically the fastest processor on earth for multiprecision adds and multiprecision shifts. (Note that the IE browser has difficulty with the table rendering -- the Alpha's 93.5 score in lshift is *lower* than the Athlon's 100.)
The Mac consistently loses to the PC, as shown by the chart below (higher is better).
Notice that you cannot divide out the Mhz here to give some sort of delusional number to represent how fast the architecture is. The benchmark is very memory intensive, and as such is not very clock rate dependent. Of course you might complain that we are really comparing 133Mhz RAM versus 100Mhz RAM ... but the fact that it has been available for some time in the x86 market and is not in the PPC market is kind of the whole point.
Update: (06/16/01) Some updated results:
Of course the real story here is the dramatic improvements made by Intel. Focusing just on the Apple results, we see that they are still playing catch-up to results from 12 months earlier -- a standard Athlon now has nearly 2x the performance.
See screen-shot here
This is easily reproduced (with Quicktime alone -- not any other PC application).
They wrote the following:
Oh how I enjoy people's estimation of my age and intelligence (which is commonly followed up by people correcting my spelling mistakes.) Anyways, the last sentence is the culprit I was looking for. Their David Koresh is outright telling his fanatic following to come throw stones at me. Notice that they clearly didn't bother to read past the first few entries of this page. I'll just tersely respond to this:
Here's the letter (slightly modified for technical reasons) from my informant who blew the whistle on mactoday.com (to the author: if you would like your name published, just email me and I will):
I must say, I definitely appreciated this letter. It's refreshing to know that even among Mac users there are reasonable, clear-headed people who will at least make a reasonable effort to understand what I've written. I'm going to respond openly here, just with the intention of clearing up what I may not have made as clear as I probably should have:
The "Gates bought our company" comment was actually not written by me. Some time ago when Apple did a complete redesign of their web site, they had a three section image which is shown on the top right of this web page (the cookie, the shopping cart and the screwdriver). At about this time, Microsoft surprised many by investing $150M into Apple. Shortly thereafter on rec.humor.funny a joke was posted with the three captions shown. I thought it was so hilarious I composited and posted the image here. I suspect either my use of the image as shown, or the rec.humor.funny post may have motivated Apple to change the images on their website shortly thereafter. Yeah, it's old. If I get inspired by something else maybe I will go change it. Any suggestions?
Of course, I'm not attached to my OS.
Yeah, ihateapple.com is quite militant. But I justify it as a counterbalance to sites like mactoday.com which are just as militant in the opposite direction. The truth is that I only added it a few days ago as a kind of serendipitous side effect of searching for the cult that was encouraging the flames I've received recently.
Indeed every once in a while, I get the urge to clean up this anti-Apple web page like my anti-Microsoft page. But I consider this page to be a very part time kind of thing. I just can't justify the effort to really clean it up. This may seem contradictory considering how large this page is, but you have to understand that it was written up in fits of energy over a very long period of time. I have almost certainly put 10 times the work into my anti-Microsoft page.
As to my C/C++ discussion ... well, unfortunately a detailed analysis of languages takes kind of an enormous amount of work. It's also a religious kind of thing. No matter how well I did such an analysis, you get morons on one side of the religious fence who will just feel like flaming me without hearing me out (sound familiar?)
Interestingly, recently I did a mini-analysis of Scheme versus C (just analyzing a single benchmark proposed by proponents of Scheme). There was a claim that Scheme was 21 times faster than C -- I showed that even theoretically it could not be more than 3 times, and that in the example given it was highly unlikely that it would even beat the C code performance under comparable implementation conditions. I put many hours of work into my analysis, but the Scheme proponent who I was aiming this at (Bruce Hoult, who is a PowerPC proponent as well -- go figure) decided, even without reading through it, that my analysis was flawed. His wording was meant to attack my credibility, which he is fond of doing on comp.arch.
I still want to do it (in the hopes that the *entire* audience wouldn't be a bunch of Bruce Hoults), as I feel the C and C++ war is an unresolved question in the minds of many. I just need the time to do it and a big boost of energy.
But in this sea of stupidity is finally the voice of reason.
Read all about "Processor Wars" on holymac.com
I am guessing that Paul Demone's article has caused a stir on some forum somewhere, a raging flame war has ensued and my web page has been cited. Anyone feel like confirming or denying it?
My other theory, which is more conspiratorial, is that some more organized Mac advocates have decided to target me. Some organization is telling their cult to mail me to tell me what they think of my page.
Either way, don't expect me to take this page down any time soon.
You mean Apple will not die because they've made translucent cases, and hired back Steve Jobs? Well, I will admit that Steve is an excellent speaker, though why Intel hasn't sued him for libel is beyond me.
Making a see-through box is not "transforming" -- it's called a "fashion statement". It's a one-trick pony that doesn't have any longevity (it's too easy for PC manufacturers to do their own colorful computers.)
The only reason why Apple might have the best selling kind of computer of some particular category is because they have a monopoly position on the MacOS market. Collectively, the "Wintel" manufacturers stomp all over Apple in any category.
Apple is experiencing a resurgence right now (this page was written while Gil Amelio, aka Dan Quayle, was head of Apple); however, so far Apple has not demonstrated any interesting long term strategies other than "OS X", which is still an unknown quantity at this point. (Just as Windows 2000 and the ultimate impact of the various Linux distributions on the PC are unknown quantities.)
This is Steve Jobs' propaganda and it's simply untrue. Every demonstration he has done has been rigged in some way, and his conclusions are gross misstatements. The performance of a top-of-the-line PowerPC G4 (450Mhz) running MacOS is nowhere near the performance of a top-of-the-line x86 (either Athlon @ 800Mhz or Pentium !!!B @ 800Mhz) running Windows.
The funny thing is that at the previous MacWorld, Steve ran a demonstration showing the G4 450 running some bogus test where they projected that a comparable Intel CPU would need to be running at 800Mhz. Several months later, Motorola has been unable to up the frequency, and Intel just goes ahead and ships this chip that Steve said would take "some time" for Intel to deliver.
So even in the most extreme cases of rigged tests, the top of the line x86 CPUs still beat the top of the line G4.
And don't give me any nonsense about Motorola coming out with something better "soon". Intel and AMD have plans too, and they will be leaving the poor PowerPC architecture in the dust.
Absolutely not true. It's just that Steve's demos are consistently fraudulent. The last head-to-head benchmark that Apple did was against a 500Mhz Pentium !!! at a time when Athlon 650's were available and the Motorola CPU he was using was not yet available.
It's very simple -- the PC platform has more choices for hardware expansion, and not all configurations are robust. Different vendors with different emphases will have different degrees of "problems".
For example, HP Pavilions are typically not the fastest configured x86 boxes, but have a reliability and ease of use that would easily rival a Macintosh. By contrast, Gateway and Micron are much more aggressive, constantly trying to win performance wars -- as such they commonly choose configurations which are not as robust, but which deliver greater performance when they're working.
It's called consumer choice. With the Mac, you have only one choice: Apple's choice. Now Apple has quite simply chosen to sacrifice performance for stability, which makes perfect sense these days since no configuration of a Mac can match the performance of a high-end x86 box.
Untrue. Numerous informal polls indicate that upwards of 30% of all computer users play games on their computer. Apple knows this which is why they are trying to revitalize their platform for games (adoption of OpenGL, endorsement from id software, etc) -- but they have a serious uphill battle. They are competing with AGP 4x, and CPUs with monster floating point capabilities that will not be matched by the G4's piddly AltiVec instruction set.
No, it indicates that the game market is saturated and that there are a *lot* of games out there. Walk down the aisles of any software shop. There will be a very large section just for games. Then walk down it again in 3 months -- most of the games will be different. Everyone is just playing different games. At this point you are just making things up about reality.
That's true. Apple has woken up to the reality that competing against PCs head to head on standard consumer applications is a no-win scenario. They have simply picked yet another niche (graphics design being their previous niche) and are touting it as the next big thing. Only a loyal Mac diehard is so easily swayed by this. Update: Apparently such FireWire + digital camera solutions have existed for PCs starting pretty much in the same time frame. The big difference is that it's such a niche market that it barely gets any air play in the PC arena, where more credible features are used to influence your PC purchase.
That sub-$1000 Mac is a piece of garbage outside of the fact that they've bundled it with its video editing capabilities. (Update: a demonstration on ZDTV's "Fresh Gear" showed the iMac platform to be very non-realtime and very non-linear when performing some standard video composition effects.) They are selling you a "pre-obsoleted machine" into which they are going to try to build this video editing market. I don't think this latest ploy is going to go nearly as well as their fluorescent cases.
FireWire is "too early". There simply are not enough devices to justify it. USB is simply a better solution for today. FireWire may inevitably be adopted by PC manufacturers as well, but it's not because they can't put this into PCs cheaply today. It's simply the realization on the PC manufacturers' part that this is not a justified cost for today. For example, USB-II looks to seriously marginalize the utility of FireWire.
The Mac market relies heavily on this delusion that Macs are somehow better than PCs because of their useless doodads. I'm sure Apple feels that adding FireWire is necessary, not because there are enough devices out there to justify it, but because it makes an artificial stab at the PC market.
You know, ordinarily I would not be so quick to judge what you know or don't know, but only by admitting to being a "helpdesk person" have I formulated the opinion that you are a "Mac geek who knows nothing of PCs". Sorry, but my experience with "help desk people" is that they generally don't know what they are doing -- they are typically about 2 years behind the general knowledge curve regarding the computer industry.
Just for reference, I am a software developer who works closely with hardware devices. I am reasonably well known for my "programming optimization" ability as well as my understanding of hardware from a performance point of view. In the past, I have informally helped "Tom's Hardware Guide" in understanding performance issues.
Kind of puts things into perspective doesn't it?
Sounds like baloney to me. I don't think any video card vendor would in their right mind release drivers that disallowed the use of the most popular software application on the planet. Your comments don't seem to be in tune with "reality". This sounds more like what you as a die hard Mac user wants to believe rather than what the reality of the situation is.
See my comments above about choice.
5/11/99 While it will not surprise anyone that, on this decidedly anti-Apple web site, I basically claim that Apple is lying with their performance claims versus x86 CPUs, it turns out that the internet news community on comp.arch has chimed in with a similar sentiment.
Now, the folks on this newsgroup are probably among the most technically savvy in terms of being able to compare and assess the value of a CPU architecture. The people are well known in their circles, and the topics discussed are usually well grounded in terms of state-of-the-art processor design.
A thread started innocently enough with the following message:
Hi, I'm badly in need of some material in web site or advertisements that make wrong statements about some arch's performance, such as emphasizing on one part of performance while ignoring other parts. The more egregious, the better.
Have you seen such things, many thanks!!
Now there is a fairly even representation of processor architecture folks on this newsgroup, as well as bold claims made by most processor vendors. So one would naturally expect this kind of thread to degenerate into a lot of back and forth between each processor proponent's opinions about other vendors' claims that are on shaky ground. But that is not what happened at all. There seems to have been only one theme to follow -- just how badly Apple has been lying in its advertised performance claims for the past few years.
In the 22 messages that followed this one they all had basically the same thing to say:
The classic example is the PowerPC vs x86 ads some years ago, comparing predicted performance of future PPC's, including some that never was delivered, against the then current Pentium chip.
The ad very clearly implied that at the time of the ad, PPC was twice as fast as the best x86 (NOT!), and that this difference would increase exponentially.
How about Apple's use of BYTEMark numbers to claim that the G3 was -- twice? three times? as fast as a Pentium II?
Anthony D. Tribelli:
[...] The advantage to Apple was that the PC binary (for ByteMark) had not been updated in years. Even when new the PC binary was troublesome, the base PC system (score = 1.0) was a Compaq Pentium 90 of some model but the binary was optimized for 486 rather than Pentium. Many years later, this 486 binary was still sitting on Byte's website.
A problem of equal magnitude is that the source is simply naive. It tests the compiler as much as it does the CPU. One can get a 2X speed difference on a Mac just by using two different compilers.
When the best available PC compiler and best available Mac compiler were both used (at the time of the original Bytemark-based ads - I don't know about current compilers/systems), Mac showed a 25-35% advantage.
Here are the scores from ZDNet, from their review of the iMac. They mention the apparent dishonesty of Apple marketing with regard to this benchmark. Personally, I consider it out and out lying. No one is going to tell me that Apple didn't try recompiling the code, and didn't know that the scores were totally inaccurate.
The first line is the numbers that Apple got from an executable that they found on the web, reportedly compiled for the 486. The second line is the result that ZD got with an up to date compiler in September, 1998. It looks like to me that the P2 was actually twice as fast as an iMac.
If you check the SPEC, Apple and Intel websites, you will see that the same thing is happening with the latest G4 machines: Apple claims overwhelming superiority, but is actually below average. [...]
In its desperate attempt to stay no more than 200Mhz behind Intel and three years behind the rest of the industry (read: SIMD), Apple prematurely announced -- oh, I don't even know why I bother. Let's get right into it with a brief outline of their lies.
Appearing on Apple's website at the time of the launch:
"The first supercomputer on a chip"
This was actually claimed and delivered upon by a little known company named Chromatic Research in 1998. This claim is typically based on the idea of having DSP-like calculating functionality (essentially SIMD) that delivers performance similar to Cray supercomputers of yesteryear. AltiVec (now called "Velocity Engine") is nothing more than catch-up to AMD and Intel, each of which have evolved similar SIMD functionality in products that have actually been delivered to customers for years now.
I have assurances from university researchers who are familiar with Cray supercomputer performance levels, that the K6-2 (which was shipped in May of 1998) also had similar functionality and performance levels to a single CPU "Cray-3". To call themselves first is just a lie.
Just to put things into perspective, the main use of SIMD technology today is to deliver higher performance games as well as higher quality video playback. To this end, x86 based PC's have game performance that is simply not even approached by the Mac platform and most mid-range x86 PC's can deliver DVD playback which meets its highest quality standards. I don't believe the Macintosh platform has any opportunity here except to attempt to catch up to where the x86 PC already is today.
The image above is very suggestive that the Pentium !!! is a two way execution engine, while the PowerPC G4 is a 4-way execution engine. This is an extremely gross exaggeration. When using ISSE, the Pentium-!!! can generate 4 results per clock with fewer instruction bytes than "Velocity Engine".
The Pentium-!!! can run two SIMD calculations at once, while the "Velocity Engine" really only has one SIMD calculation engine (and thus is not even superscalar in its "Velocity Engine").
On non-branch instructions, the Pentium-!!! is a 3-way execution engine, while the PowerPC is only a 2-way execution machine.
"The new PowerPC G4, architected by Apple, Motorola and IBM, is the first microprocessor that can deliver a sustained performance of over one gigaflop. In fact, it has a theoretical peak performance of four gigaflops."
That's nonsense. The very nature of the Intel and AMD SIMD instruction sets allows them to easily exceed 1 GFLOP in most situations. In truth, the theoretical maximum of 2 GFLOPs for a 500Mhz P-!!! or K6-2 or Athlon is very easy to come very close to on real world code. It's part of what is leading to continually higher and higher frame rates in games like Quake. (I wonder if Apple is at all interested in running timedemos head to head with comparable PCs.)
"Velocity Engine"'s ability to hit 4 GFLOPs requires that back-to-back MADDs are performed on every clock (counting a MADD as two operations). No other combination of "Velocity Engine" code will deliver that level of performance.
Update: A deeper analysis of the PowerPC G4 architecture shows the 4 GFLOP claim to be an outright lie. The "completion queue" limits the processor to 6 AltiVec instructions out of every 8-instruction sequence without stalling. So even stuffing the remaining two instructions with single precision FP-MADD instructions leads to an absolute peak of 3.5 GFLOPs. Of course, constructing such code which corresponds to a real world algorithm is basically impossible to come by.
"The Power Mac G4 comes with an ATI RAGE 128 graphics accelerator."
Well, good for you. As of 08/31/99, the ATI Rage 128 was two generations behind the state of the art. nVidia announced the availability of the GeForce 256 graphics accelerator, which adds 50 GFLOPs of performance to an x86 enabled PC, in addition to an unparalleled level of graphics performance. This obsoletes the nVidia TNT2-Ultra, which in turn obsoleted the ATI Rage 128. In Jobs' otherwise well-calculated drive to subsume existing PC technologies, he failed to realize the incredible volatility of the graphics industry and has ended up with an obsolete lemon.
Other things to note: At 450Mhz, SpecInt/FP95 are estimated to be 21.4/20.4. The Int score is simply not competitive with Intel or AMD, and the FP score (while ahead of Intel) is still well behind where AMD's Athlon is. Given that the G4 is a RISC chip, it's staggeringly pathetic that they cannot match the floating point performance of the latest x86 chip. However good the Athlon's x87 FPUs are, the instruction set itself should keep it from reaching the performance of comparable RISC chips (indeed, CPUs like the Alpha 21264 and PA-RISC have no trouble trouncing the Athlon at FP.) Nevertheless, it was brave of Motorola to release these numbers. They've probably grown tired of Steve Jobs' fraudulent Photoshop demos.
The primary use for a SIMD instruction set such as AltiVec is for games and multimedia. What this means is that unless the G4 suddenly improves substantially in these areas, there will really be no point to it. Hands up, all of you who think that the G4 is going to compete with the AMD Athlon on games performance?
It has also been disclosed that the new G4 now supports "speculative execution" (where the processor executes instructions before it knows for sure that the code path will be taken.) Oooo, aaah. Sounds neat, doesn't it? Guess what, folks: this technique has been used in x86 processors for years. Both the K6 and P6 cores perform fully speculative execution (meaning there are no caveats or strings attached -- the degree of speculation is as deep as the instruction buffers will allow, which is very, very deep.) It goes without saying that the Athlon also executes speculatively. At best this is catch-up on the part of Motorola.
I am not alone in laughing at the new G4 (especially versus the Athlon.) Take a look at http://www.ugeek.com/readercomm/091999/comm101.htm for another clear headed opinion about the new G4 versus contemporary x86 processors.
Here's some more from John Carmack's .plan file:
Apple's new G4 systems.
The initial systems are just G4 processors in basically the same systems as the current G3. There will be some speedup in the normal C code from the faster floating point unit, and the Apple OpenGL has AltiVec optimizations, so framerates will improve somewhat. The limiting factor is going to be the fill rate on the rage128 and the bandwidth of the 66mhz pci bus and processor to main memory writes. The later G4 systems with the new memory controller and AGP will have better performance, but probably still limited by the new 3D card. After Apple gets all their driver tuning done, it will be interesting to try running timedemos at low resolution to factor the fill rate out. Apple has a shot at having the best non-geometry accelerated throughput, but it will still be tough to overcome a K7 with an extra hundred or so mhz. On a purely technical note, AltiVec is more flexible for computation than intel or AMD's extensions (trinary ops), but intel style write combining is better for filling command buffers than the G4's memory streaming operations.
I'd say Carmack is being optimistic about the G4 systems. Trinary ops do not translate to any real world performance gain that cannot be overcome by hardware register renaming and ordinary software-based scheduling techniques (although Carmack is careful not to claim any performance advantage from this). The Mhz differential with the "K7" is actually 200Mhz. The x86 OpenGL implementation has been performed by ex-SGIers (Michael Gold, Mark Kilgard, etc.) whose degree of optimization is unlikely to be matched by any Apple implementation, so I don't know why Carmack thinks the G4 has any chance of even matching x86's at the same Mhz on Quake 3. In any event, he does confirm that the Rage 128 is just too slow, that 66Mhz PCI (as opposed to AGP) does not cut it, and that the current memory controllers in the G4 are pretty bad.
Subject: Apple sues sky: claims color scheme stolen from iMac
From: "H.W. Stockman"
Newsgroups: comp.sys.intel
Apple Computer interim supreme commander Steve Jobs announced today that he was suing the sky for infringement on the stylistic design of the iMac. "The translucent blue color is a clear ripoff," said Jobs. Apple also announced its plans to sue numerous manufacturers of toasters. "The rounded corners and approximate size of the so-called `food-appliances' are clearly inspired by the iMac," said Jobs, as he gazed lovingly into a mirror at an image he referred to as "God". Also cited for possible style infringements was the bottler of Blue Nun Liebfraumilch. "The bottle is almost exactly the same height as an iMac, the name `Blue' is clearly an infringement on our color scheme, and the bottle is also translucent. Don't tell me that's a coincidence."
http://www.news.com/News/Item/0,4,40716,00.html?st.ne.fd.gif.e
The following poll was taken on an Apple oriented site (which uses HTML code that is not Opera compatible -- sigh):
The results, when I checked (08/01/99), were as follows (html changed for syntactical correctness):
As of 08/02/99, when Intel releases their 600Mhz Pentium-!!!, Motorola/Apple will be 150Mhz behind. (The AMD Athlon is supposedly also due out soon, and will reach 600Mhz at introduction.)
Update: Holy cow! AMD surprised everyone by introducing a 650Mhz version of the Athlon with greater per-clock performance than any Intel based CPU. So that puts Motorola 200Mhz behind the current x86 leader. (This is starting to look like game, set, and match soon, at least for the PowerPC processor.)
Oh, while I am here I might as well make fun of Apple's high end G3 specs: a P-!!! Xeon 550Mhz with full speed 2MB L2 cache is available, Xeons support up to 4GB of actual on-board memory, and ATI Rage 128's are soundly beaten by the nVidia TNT2, TNT2 Ultra, Matrox G400Max, and 3DFX Voodoo 3000, which are options on any x86 based PC -- in addition to other super high end cards like the "Wildcat" and "Oxygen" video cards that are not available for the Mac.
450Mhz Pentiums are kind of low end by today's standards. So making comparisons to one is kind of silly.
It appears as though Apple has settled a class action lawsuit regarding some disputed charges for technical support. You can read about it on Apple's site.
05/05/99 I recently got to meet an employee from Xponential who was intimately aware of what happened. First of all, a recap: back in 1996, Xponential had developed a 400Mhz PPC part that they were able to crank up to 533Mhz in a technology demo. This was back when x86's and PPC's from IBM/Motorola were running at no more than 300Mhz. The inside scoop was that the 400Mhz processors really only performed like a 250Mhz part in real world tests (mostly because of bus limitations), though they demonstrated fantastic performance in the now coveted "Adobe Photoshop tests" that Apple pushes so much these days.
Anyhow, as is well known, there is a serious market for PCs which are quantified purely by clock rate. That is to say, even if the CPUs didn't actually perform better, they would still have sold, with customer perception being that they indeed were faster. Somehow Apple, under Gil Amelio, missed this -- egged on by IBM, who was promising some fantastic new PPC part that would blow away the Xponential and the then-current 604e's that Motorola was shipping (IBM never shipped this part). Xponential lost Apple's business, most of the investors pulled out, and the company soon went belly up -- though not completely.
After selling off its patents (which had implications for Intel on Merced's intellectual property) in a highly publicized auction (S3 bought them for $10 million, and incredibly extracted from them a broad patent cross-licensing agreement with Intel -- a gigantor company at least 10 times their size), Xponential decided to take one final go at it by selling their CPUs to the Mac clone manufacturers (you remember them, right? Power Computing, UMAX, etc.) Just as the clones were poised to go public with their highly clocked Xponential-driven boxes, Jobs stepped in and pulled the MacOS licenses out from under them.
Why does this matter, you ask? It's very simple -- there is a single chip model in this industry that has awed all who have understood it: the DEC Alpha. Although a marketing failure, there is no doubt of its technological success. The first generation was an in-order, single-issue, fully pipelined, ho-hum processor; the second generation was a 4-way superscalar architecture; and the third generation is a fully out-of-order, super-pipelined, multi-ALU monster chip. Throughout its lifetime, the Alpha has always had unbelievably high clock rates and has been either the fastest or second fastest CPU on the market (even though Digital's silicon fabrication process has always been at least one generation behind Intel's.)
What the Alpha model proves is that having a very high clock rate is indeed incredibly important, because what you learn in designing a processor like that you can carry with you even as you improve the architecture from generation to generation. (The mistake that PPC advocates always incorrectly and irrelevantly make is to cite that architectural performance can be more important than clock rate -- what this misses is the fact that clock rate scalability is just as important as architectural performance; furthermore, having a high clock rate does not preclude having a good architecture -- ergo the 21264 processor.) This is something that the Xponential folks clearly understood. Apple showed their myopia in not having enough vision to try to support the Xponential guys. With the Xponential technology, by now they might have had 700Mhz+ CPUs with improved architectures that really were twice as fast as x86 CPUs. That kind of oomph could have significantly expanded Apple's product line and made it far more compelling than the x86 boxes out there today.
So why can't Motorola simply pick up where Xponential left off, build some high clock scalability intellectual property, and bring this idea to reality only a few years later? It's like all technology -- blink and you miss it. With the purchase of Digital, the Alpha team has basically disbanded; however, many of its members have landed at Advanced Micro Devices and Intel. To put it mildly, it's too late. Both Intel and AMD have demonstrated 1GHz parts -- and not just some technology BS, but actual working parts (both companies are committed to shipping such CPUs by the year 2000.) Motorola doesn't have a prayer of catching up anymore. The disparities in clock rates today between x86s and PPCs are only going to get worse -- much, much worse.
Apple is currently faced with a crisis. It may not hit them today or tomorrow -- but the fact is that neither Motorola nor IBM can keep up with Intel and AMD, which are going to be going like banshees with their incredible clock rates and incredible CPU architectures, all in affordable PCs. Apple will soon have to face the fact that their PPC boxes will be a serious piece of crap compared to the contemporary x86 box. And no, AltiVec will not save them.
I hear rumours that Apple is looking to switch to the x86 platform -- but how do they do that without admitting that they've lost? As usual, they look like they are totally missing the boat -- companies like Connectix, VMWare, and possibly Transmeta could probably solve their problem by developing technology for high quality, high performance emulation of PPC code on an x86 processor. This would allow them to develop an x86-based version of OS X while retaining backward compatibility with the Mac. But I have not heard a single peep from anyone indicating that anything like that is going on.
Jobs has proven that he is a resourceful guy as well as a master at marketing (perhaps almost as good as Gates himself) but what will he do when he is faced with a really hard problem like this one? We shall have to wait and see.
At the MacWorld expo in San Francisco recently, Steve Jobs finally made announcements from Apple that I could not laugh my ass off at. This is what Steve Jobs announced (that I care about):
OS X server - it's vapourware no more; it's an actual product.
OpenGL & ATI Rage 128 support.
Connectix Virtual PlayStation product.
Dropping SCSI in favor of FireWire.
Whoah. Finally -- after years of useless baloney out of Apple, some actually credible, real products.
First, some comments: (1) PPCs are not faster than x86s -- Adobe and Byte just have interesting concepts of how to write fair cross-platform applications. (2) The ATI Rage Pro in the iMacs is also in some PC systems, and is generally considered among the very slowest cards in the PC market (nVidia's cards and Matrox's G200 have no trouble trouncing the Rage Pro). The Rage 128 is very fast; however, it will be available very soon on PCs as well -- clearly Apple just did a positioning deal with ATI for the product launch, which will not deter the PC's adoption of it. Also, some PC power users will probably like 32MB of RAM or more in their versions.
I'm not sure why Jobs wants to continue misrepresenting the facts like this.
Anyway, when we cut through all the crap and baloney that Jobs feels he must be a part of, he was left with a few technologies that, there is no denying, are actually good for something.
OS X represents an answer to Be, Linux, and NT all at once. Now, I don't know many details about it, but if it sprang from the UNIX-like NextStep, then it probably is a serious no-nonsense OS.
OpenGL means that all the id games, and lots of serious 3D graphics work, can move to the Mac if so desired. The pressure on other vendors (consoles, digital TVs and so on) to move to OpenGL so that they can develop on a Mac or a PC will grow, and the competition to MS's proprietary and controversial DirectX will be more credible.
SCSI is stupid (too expensive for the irrelevant benefits it provides). Glad to see Apple dropping it. My reading on FireWire is that it's state of the art right now.
The new G3 form factor is actually quite nice (as opposed to the iMac which is just a joke). Being someone who lugs my PC around every now and then -- that actually seems like quite a good idea. The handles seem like a winner to me. (Every time I lug my PC around, I'm scared I'm going to drop it.)
Ok, so does this mean Apple is going to start making serious inroads into the "Wintel" market with these machines? I don't think so. They are still limited to a market of people who don't understand why the Mac OS is a piece of crap, but they are moving in the right direction. If they want to launch a credible attack against -- well, at least Microsoft -- they need to take a few additional steps:
They need a way to lose the Motorola dependence. The PowerPC is a dead end. I'm not saying that they have to go x86 -- after all, they could try the Alpha -- but just lose that ridiculous PPC processor. It has no future. 400MHz toward the latter half of 1999 will be useless -- there are a couple of serious CPUs that will be in the 600MHz-800MHz range before the end of the year. Anyone who understands the current state of architectural scaling of CPUs knows that the future is in the clock rate, not in any claims of "better CPU design" (Pentium-IIs and CPUs from AMD are actually better designed than Motorola PPCs anyway.)
Make sure they are able to support lots of graphics vendors (the ATI Rage 128 won't be the fastest forever -- ATI, like most other graphics card makers, is notoriously mediocre at delivering a consistent, well designed architecture that scales for any appreciable amount of time.)
Well, they'd better bring down the price of OS X server. I'm sure they are limiting the volume so that they can use the initial customers as unofficial beta testers (that's what this industry has become) but $999.99 is a bit much for an OS even if it is for servers. Remember that the competition is MkLinux (which costs significantly less).
Apple just may be making a real serious come back.
So should I eat crow? Well, in my defense you have to consider the facts: after the introduction of the Macintosh, there have been, in my opinion, no serious technology innovations from Apple. That was a decade of languishing while Bill Gates cobbled together stuff to catch up and leave the Mac in the dust. I've had absolutely no real feedback suggesting that anything would happen to really change that. Perhaps I should have given Apple's iCEO (or at least the team he brought over from Next) more credit.
The fact is these developments have all come to a head very recently, which is about as quickly as Jobs and his Next team could reasonably have pulled together their substantive changes. (Translucent boxes, price slashing, fraudulent performance claims, and slapping a disk drive and an internet search engine together are about all they could do immediately.)
And the situation has even gotten better for the Mac line ... I have recently gotten information under NDA that indicates that Apple will be taking a much more radical step to ensure the future of the Mac line. I am very sorry that I cannot disclose this to anyone until it's public info ... Hmmm, let's go see if it's on that moron "Makido"'s site ... Well, I can't find it. (That guy is such an idiot -- Intel lost their market share to Advanced Micro Devices, not Motorola/Apple; Intel's reference compiler comes with VTune, which is about the same price as a compiler; and Gil was clearly swindled out of his job by Jobs, who it looks like is on a mission to "save Apple".) Looks like you will have to wait for it.
So I'm playing Apple's pride and joy QuickTime, because my alternative is RealPlayer, which I haven't registered, so I can't save RealPlayer files. It turns out that unless it is playing in the foreground, the QuickTime player refuses to play audio. Uhhh ... there's absolutely no reason for that; AVIs can play concurrently in the background no problem. I suspect Apple is just artificially limiting QuickTime to keep it from appearing to function any better on Windows than it does on the Mac (where I'm guessing it has the same problem -- but only due to an OS limitation, since there isn't really any such thing as a concurrent task.)
Anyway, that's not such a big deal, but in that case I would like the option of forcing the QuickTime player to the foreground (Windows apps can be programmed to have a "force to foreground" mode.) But there is no such feature.
What the hell? No wonder Apple is so eager to push their "standard" onto Windows. That way they can ruin the whole OS experience through active limitations in their so called standard. Bastards.
At the recent Microprocessor forum (and following the Intel developer conference) Intel gave the details of the KNI architecture.
At the "Makido" site there is an article attempting to detract from KNI in favor of AltiVec. Here is my rebuttal:
Remember that MMX + 3DNow! has been around for quite some time now. Your so-called "rush to market" has Motorola and its VMX/AltiVec vaporware practically twisting in the wind. These technologies are actually here, ready to be used and showing tremendous results (John Carmack says Quake on PPC is theoretically the same as Quake on a P-II, which we now know is a lot slower than the 3DNow! implementation.) Intel is late, and so is Motorola. In that sense both KNI and AltiVec are failures.
Now specifically onto KNI vs. AltiVec. Your "3 register operation" nonsense isn't worth the bits you wasted on your web site to talk about it:
PPC 750 + AltiVec (issue rate of 2 instructions per clock):

    AltiVec_MUL reg0, reg1, reg2

Katmai (issue rate of 3 instructions per clock):

    mov     reg0, reg1   ; This instruction is essentially free.
    KNI_Mul reg1, reg2   ; SW swaps meaning of reg1 and reg0
To counter the "few registers" argument: the P-II architecture has direct load forwarding, meaning that redundant loads (with other unretired loads) take 0 clocks, in the same way as above. I.e., the extra "loads and stores" you are talking about are irrelevant. Since the need/desire for 3-operand calculations is very rare, it barely scratches the extra 50% decode bandwidth of the P-II, while saving lots of otherwise usually unused silicon.
The "too few registers" argument really doesn't hold much water, since the real performance bottleneck of modern out-of-order architectures is the ALUs.
The same holds for the "multiply accumulate" issue. Under KNI, you simply follow the multiply with an add instruction. While this does occupy two out of 6 decode slots, leaving 4 (versus AltiVec's 1 out of 4, leaving 3), it does achieve the minimum unit latency. And with SIMD instructions, latency starts becoming a lot more important than raw throughput.
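The decomposition being described can be sketched in plain C (an illustration of the two-instruction argument, not actual KNI or AltiVec code; the function names are mine):

```c
#include <stddef.h>

/* Sketch of the multiply-accumulate argument above (illustrative only,
   not real KNI/AltiVec intrinsics).  A MAC-capable ISA like AltiVec
   computes d[i] = a[i]*b[i] + c[i] in one instruction per element; KNI
   issues a multiply followed by a dependent add.  The results agree --
   the difference is decode slots and latency, not correctness. */

void mac_one_step(const float *a, const float *b, const float *c,
                  float *d, size_t n)
{
    for (size_t i = 0; i < n; i++)
        d[i] = a[i] * b[i] + c[i];   /* one fused operation per element */
}

void mac_two_step(const float *a, const float *b, const float *c,
                  float *d, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        float t = a[i] * b[i];       /* multiply ...             */
        d[i] = t + c[i];             /* ... then a dependent add */
    }
}
```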
Also remember that, since the x86 architecture uses a variable-length instruction set, the average code length is a lot smaller than that of similar code on the PPC, increasing effective instruction bandwidth.
Your "ugly" comment has no technological relevance.
Intel, having spent a long time on KNI, and rumoured to have had higher precision as well as other features, has ended up with ALU performance that is essentially identical to 3DNow!'s. So the question is: with such a long development time, how is it that they were not able to improve on 3DNow!?
Simple -- the architecture was optimized for highest real performance. Not baloney paper and pencil performance like AltiVec. The internals were simplified to make sure that Katmai could reach and exceed clock rates of 500 to 600Mhz. The folks at AMD/NexGen simply arrived at this same "sweet spot" architecture a lot sooner than Intel (AMD has other issues limiting their clock rate though). Intel was beaten by a design team with more foresight -- they were not out-designed. AltiVec is just being beaten much worse (from a time to market point of view.)
Now, given AltiVec's rather "aggressive spec", with all the extra silicon required for 3-operand ALUs, multiply accumulates, and log and exp approximations (*laugh* -- who the hell needs frigging logarithm and exponential approximations?!?!? That is such a ridiculous joke!), how the hell is AltiVec going to achieve its performance goals?
There are two choices for Motorola: (1) They simplify their internal architecture to something similar to Katmai/K6-2 (via double pumping like KNI for example) thus achieving no more performance but allowing for higher clock rates than are currently available in PPCs, or (2) They implement a highly aggressive low latency, fully pipelined and fully parallel architecture but become restricted by clock rate.
You can see how the AltiVec architecture has started to come apart at the seams when you look at how its "reciprocal" and "reciprocal square root" approximations give only 12 bits of accuracy and lead to a first-iteration accuracy of 1 ulp for only 95% of inputs (3DNow! is 14/15 bits, and is exact for 85% of inputs and within 1 ulp for all remaining inputs.)
In addition, I believe Motorola has announced a die size for AltiVec that indicates that it is very small. This leads me to believe that they have tried to cut corners, and therefore have gone with approach (1). (Fast, large register designs usually require lots of die area.) Also, the fact that IBM has announced processes that will take the PPC to 400MHz and beyond indicates that they cannot afford to let AltiVec hold them back.
Sorry, but I just don't see AltiVec making a serious difference versus 3DNow! or KNI. It'll all be about clock rate, and it looks like Intel is winning that race. (If you ignore Kryotech's refrigerated AMD K6-2 500Mhz parts.)
Update: Ok, so I've read through the MicroDesign Resources explanation of AltiVec. Rather shockingly, it does appear to be a solid 2 times faster than KNI/3DNow!. It's basically a single-issue, fully pipelined, single-pumped, 4-wide SIMD unit.
So what's the catch? Simple: It won't be available until Q3 '99 and it will only be running at 450Mhz in a dual issue CPU. x86 CPUs should be well above 600Mhz by that time frame and be triple issue. In other words, Motorola has wasted this huge amount of time designing this AltiVec monster that will drag down the overall performance for sake of a theoretical architectural coup. (And they have missed the entire SIMD market between '96 and Q3 '99 -- ridiculous.)
Hiding behind some claim that Mac users don't want it in its current form, Jobs announced that the next generation of OSes for the Mac will be Mac OS X (meaning version 10, I suppose.) He then gave some song and dance about how Rhapsody, in its full implementation, will be available some time after that. (Keep in mind that the BeOS, which was an alternative to NextStep, is shipping in Beta form for both x86s and PPCs right now.)
(As you can tell, I've grown so apathetic about what is happening in the Apple world that I didn't even bother to report this when it became news ...)
06/11/98 IBM has sold their interest in the PowerPC processor back to Motorola. This should come as no surprise, since IBM has been aggressively licensing Cyrix, AMD and Centaur designs, and probably makes far more on them than on the PowerPC, in which they invested so much more money.
(05/19/98) VMX has been renamed AltiVec: reportedly 162 instructions with SIMD capabilities, targeting 32 new independent 128-bit registers. Following the PPC architecture as I know it so far, this means that if they use one unit (likely) they will be able to compute 128 bits of result per clock, with a granularity of 32-bit floating point, or 16- or 8-bit integer. This is no more than the K6-2's throughput, which delivers 2x64 bits of result per clock.
Such a decision to go with 128 bits doesn't make a lot of sense to me, since you are not ordinarily going to encounter that much parallelism in data per operation in ordinary algorithms (except for Photoshop, of course; it makes me think Motorola's CPU division is nothing more than a hardware division for Adobe.) More to the point, typical multimedia algorithms do many simple operations (add, subtract, average, multiply, saturate) on parallel streams of data, then funnel them (with an accumulate or select function) into a narrow data stream to perform more complicated functions (reciprocal and inverse square root.) But since AltiVec is orthogonally 128 bits, it has to commit all 128 bits of its compute power to each operation, even if it's just a divide on a single 32-bit quantity. This lack of flexibility versus the K6-2 (which uses two units that target 64 bits each, but which can be assigned to different operations) makes it seem inefficient by comparison.
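The shape of a typical multimedia kernel described above -- wide, simple operations funnelled into a narrow result -- can be sketched in plain C; a SIMD unit would run the inner operation on 8 or 16 elements per clock, while the final accumulate is inherently narrow:

```c
#include <stddef.h>
#include <stdint.h>

/* Saturating add over two parallel byte streams, funnelled into a
   single accumulated sum.  A scalar sketch of the pattern only; real
   code would use MMX/3DNow!/AltiVec operations for the inner loop. */

static uint8_t sat_add_u8(uint8_t x, uint8_t y)
{
    unsigned s = (unsigned)x + (unsigned)y;
    return (uint8_t)(s > 255 ? 255 : s);   /* clamp at 255 */
}

uint32_t blend_and_sum(const uint8_t *a, const uint8_t *b, size_t n)
{
    uint32_t acc = 0;
    for (size_t i = 0; i < n; i++)
        acc += sat_add_u8(a[i], b[i]);     /* wide, simple op           */
    return acc;                            /* narrow funnel (accumulate) */
}
```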
Motorola has left the door open to the possibility of having multiple AltiVec units in a PPC, but I find it hard to believe they would do that (instead of separating their load/store unit, which would be much more valuable to them in terms of real world performance.) But then again, so have AMD and Intel.
One difference they seem to be very proud of is a permutation instruction, which is indeed very useful (as long as they make no attempts at patenting it, since the MPACT media processor, as well as other DSPs, clearly predates it.) On the other hand, they do not offer any SIMD 32-bit integer operations (which are present in the MMX design available from all x86 CPU vendors) -- their integer SIMD width is limited to 16 bits. Also, since the new K6-2 instructions (called 3DNow!) are few in number (21), they are very simple to understand, and programmers can get to work with them and understand optimization immediately (I've invested a total of about 2 days just studying them, and I understand them. Update: I've now spent considerably more time with 3DNow! and I must tell you it rocks -- it certainly beats developing on vaporware.)
According to the tech docs (available here), they committed silicon to approximate log_2 and 2^x functions. I can't possibly imagine an application wanting 128 bits of log_2 or 2^x result per clock, but hey, it's Motorola's die area to burn, I suppose. Their 1/x and 1/sqrt(x) estimation functions generate only 12 bits of accuracy, which is 2 bits less than the K6-2's on first iteration. It is not surprising that they include a note regarding numerical instability that prevents them from generating consistent estimations within 1 ulp even after using Newton-Raphson. They also do not support a built-in iterator, so the Newton-Raphson step must be hand coded by the programmer, into what appear to be 4 additional instructions, versus the K6-2's 3 additional instructions.
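For reference, the hand-coded Newton-Raphson refinement being discussed looks like this in plain C (a sketch: the hardware estimate instructions are as described above, but the seed value below is made up for illustration):

```c
#include <math.h>

/* One Newton-Raphson step for refining a reciprocal estimate.  The
   hardware (3DNow!'s PFRCP, AltiVec's vrefp) supplies a ~12-14 bit
   seed x0 ~= 1/a; each step roughly squares the relative error:
   x1 = x0 * (2 - a*x0). */
float recip_nr_step(float a, float x0)
{
    return x0 * (2.0f - a * x0);
}
```

Starting from a deliberately crude seed like x0 = 0.333 for a = 3, one step brings the relative error down from about 1e-3 to about 1e-6 -- which is exactly why the accuracy of the hardware seed (12 vs 14 bits) determines how many extra refinement instructions are needed.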
In all honesty, I think AltiVec and MMX+3DNow! are about equivalent (the points I make above are kind of nitpicky), but the most important difference is that 3DNow! is already sampling in the K6-2 and will be released at the end of May, while AltiVec is slated for Motorola's version of the G4, which is not being supported by IBM and is not going to be available anytime soon.
Stories such as the one on news.com don't quite get the picture. The AltiVec technology does not offer anything that will not be available from AMD (which will beat them by 6 months (Update: 15 months)) and Intel (whose Katmai processor is slated to be released on roughly the same time frame as the Motorola G4. (Update: SSE will beat AltiVec by 6 months))
A little while ago, Intuit announced that they would discontinue Quicken for the Mac -- a devastating blow to Apple that would have seriously called into question the role of the Mac as a home computer (even more so than it already is.) But Intuit did an about-face shortly after, and Quicken has in fact become a bundled package for Apple's new iMac (a translucent throwback to the original version of the Mac, priced at about $1300.)
What does all this mean? Very simple, Intuit has no serious commitment to Apple, and the deal that they were trying to work out with Apple for this iMac nonsense must have been going sour for Intuit, to the point where Intuit felt it had to threaten Apple to get a fair deal. It really looks like signs of desperation from Apple.
A company called CONIX3D has developed OpenGL drivers for the Mac OS. You can find out about it at:
Now the info they give that is important to me is under Performance. They claim:
"On a Power Macintosh 8500/250, 24 bit color, 110,000 light smooth shaded, z-buffered, polygons/sec, and 135,000 light flat shaded z-buffered polygons/sec. Plug-in renderer support allows you to choose the configurations you need for those writing a commercial application. The standard release comes with six standard plug-in renderers representing different color buffer and depth buffer configurations. Plug-in renderers have 50% smaller memory footprint and support 3D hardware plug-ins. Hardware plug-ins will be available very soon."
These are NOT texture mapped triangles kiddies. (That "very soon" comment wouldn't be a Copland-esque "very soon" would it?)
On the PC, hardware accelerated OpenGL is available. A quick check of the data for the nVidia RIVA 128 (an affordable PC-based graphics accelerator that is currently available) indicates: 5M triangles/sec, 100M pixels/sec, texture mapped. Being somewhat knowledgeable about such things, I know that turning on features such as "smooth shading" doesn't cost anything on this hardware.
While I realize this is a bit of an apples-to-oranges comparison, even giving any sort of reasonable leeway for OGL overhead indicates that this is just night and day. On the Mac, you just can't get anywhere near the performance of a PC for graphics.
The commercials that Apple is currently running indicate that the G3 is twice as fast as a P-II. In his keynote address at Seybold, Jobs cited Photoshop (as usual; see above) and ByteMarks. Fortunately, Byte supplies the source code for ByteMark, which has allowed Eric Bennet to run the tests on his own.
Here are his results
As you can see, the code compiled with the Motorola/Apple in-house compiler that we know nothing about is far and away the fastest performer, but we have no idea if the results are really correct, since it is only a binary, and the ByteMark output is insufficient evidence of correct internal operation.
Furthermore, the results are not approached by any conventional PPC compiler on the market today, regardless of how high the optimization setting is set. (The PPC Linux results are a little unfair since similar P-II Linux results were not collected.) Keep in mind that your conventional Mac applications are not likely to be created from Motorola's internal compiler.
If we dismiss results that cannot be duplicated in conventional applications, we find that the PPC at 266 and the P-II at 300 are really about the same performance. That's pretty pathetic considering the P-II is being held back by its older "CISC" roots. (The 300MHz G3 is no match for the 400MHz P-II, of course.)
Twice as fast is a myth. The real question you should be asking yourself is, how well does the Mac run WinBench and WinStone?
Update: Here's an email I recently received from an insider.
Date sent: Thu, 30 Jul 1998 23:18:46 -0400
From: [Withheld]
To: Paul Hsieh

Paul Hsieh wrote:
> As an aside then, what hell is wrong with the IBM and Motorola people?
> They have gone through several iterations of the PPC including the 601,
> 602, 603, 604, 620, and 750 lines. How is it possible that they've
> implemented them all so poorly? Do they just suck?

Not that you heard this from me, but I've heard a rumor that Motorola used the wrong compilers to generate instruction traces to be used for chip designs. I don't know which of the chips were designed by Moto, and which were designed by IBM. I do know that the Apogee compilers (the ones that Moto used) cheated on 023.eqntott, because [incriminating reason withheld]

Note, too, that HP once cheated outrageously on matrix300; it will be interesting to see whether the benchmarks used to design IA-64 are an accurate sample of "real programs" or not.
As further fuel to the first read this:
TechWeb reports that Pentiums are faster in real world activities, especially graphics
Here's a warning label I've heard applied to a CD, full of articles, HTML and what not I was reading about recently:
Sorry folks, because of the way Macs treat ISO9660 disks (reading the directory structure in as a braindead DOS structure), the long filenames and directories are warped into filena~1.htm, while the HTML expects filenames.html. You can still get to the archives by traversing the CD directories via the Desktop.
Well, Mr. Jobs, your admiration for ISO doesn't exactly extend to your product now, does it? This wouldn't be an artificial attempt at making Windows look like it doesn't support long filenames, would it?
Here are some tech report calls into Apple, that I think are just hilarious ...
I would like to point out that "PaperClip" is also an old piece of software that used to run on Commodore 64s ... but that's neither here nor there.
Announced 02/27/98. Just one question ... how much did that failure cost?
The following was posted on comp.benchmarks:
Subject: Re: ByteMark Bench mark show Macs are faster ?
From: email@example.com (macghod)
Newsgroups: comp.benchmarks, [...]

In article , XpackX@xxxx.xxx wrote:
> Chip  Mhz  Spec95     Millions     Size  Power  Etch
>            int   fp   transistors  mm2   watts  microns
> =======================================================
> PII   300  11.9  8.6  7.5          203          .35
> PII   333  12.8  9.1  7.5          203          .25
> 750   233  11.0  8.1  6.35         67    5      .25
> 750   266  12.4  8.4  6.35         67    5.7    .25

I feel so lied to!! The g3 is supposed to be faster than the pentium, even up to twice as fast!!!
Geez, being an Apple user, I figured you'd be used to being lied to. Anyhow, the G3 does appear to beat the P-II clock for clock, but at the top of the line Intel is still ahead of Motorola/IBM.
Here's some more info he sent me:
How about these real world tests...

                            Apple       Gateway
                            Power Mac   Pentium
                            G3/266      2/300
Photoshop
  Open                      0:08        0:06
  Artistic - Fresco         0:44        0:54
  Sketch - Graphic Pen      0:27        0:33
  Texture - Craquelure      1:04        1:10
  Gaussian Blur             3:22        3:03
  Resize                    0:22        0:26
Premiere
  Make Movie (w/scaling)    8:16        8:44
Ok, so the G3 wins, but only by the slimmest of margins, on a test rigged to make the G3 look good (i.e. Photoshop.)
Here is an article by David Pogue written in MacWorld:
While I personally have no great love of the Windows operating system, the article above is totally delusional. Here are my quick counterpoints:
My understanding is that the Windows market share is currently more like 85%, and Microsoft recently announced (Jan 98?) that total Win 95 sales have exceeded total Win 3.1 sales. Assuming similar piracy rates, that means that Win 95 holds greater than 40% of the OS market share, with Win 3.1 a close second. Apple, on the other hand, has been held to 6%, as far as I knew. While my figures may differ depending on when the survey was taken, either way the premise that Windows is on its way down is not supported by statistics.
NCs as Ellison envisioned them are a fanciful dream (and have nothing to do with overpriced computers from Apple, BTW.) While the market will certainly be there, it won't be filled with Newtons or cheap Sun-based Java boxes, but rather with sub-$1000 PCs.
While nobody is a really big fan of this fact, Windows has found quite a home on the internet. IE, MSN, NT, ActiveX and Microsoft's flavor of pseudo-Java are all fairly big business on the net (as bad a taste as that leaves in all our mouths.)
Windows too expensive???? Give me a break. Regardless of price, Microsoft will continue to play their bait-and-switch "Windows is free with every new PC" strategy, and consumers will keep falling for it. BTW, how much in royalties is UMAX still paying to ship MacOS on their clones? How much will they pay for Rhapsody?
Finally, the issue as to whether Microsoft can be saved ... is this guy on drugs or something? Bill Gates is the richest man in the world for a reason; he's a 1/3 owner of one of the most gargantuan companies in all of the US. He doesn't make his money on the OS, by the way; he makes it from selling Microsoft Office. (Which ships on the Mac as well, for what it's worth.)
Best Buy Corp, one of the nation's largest chains of electronics stores, has decided to stop selling Apple computers, due to slow sales.
A Best Buy spokeswoman said that Macintosh sales accounted for less than 0.5% of Best Buy's dollar sales of computers over the last year.
They will continue to offer some Mac software (for now).
According to a PC World news radio report, people working on the next generation Newton have been told by Apple to start looking for employment elsewhere. Maybe Jobs is just seeing how fast he can drive Apple into the ground.
Although Motorola has not made any announcements, they are reportedly working very hard on VMX. Rumors are that it is a coprocessor that performs matrix operations in parallel to the main CPU. Sounds interesting, but Intel, AMD, Cyrix, and Centaur have all already announced their plans for second generation MMX, which will do substantially the same thing as what these rumors indicate about VMX. As to the issue of parallelism, remember that the K6 and Intel CPUs are now "POST-RISC architecture" CPUs, which means that they drive all of their units -- including their MMX, integer and floating point units -- in parallel. So we'll have to wait and see, but it's clear that Motorola has a major uphill battle in front of them, since they will be competing against the current MMX as well as the next generation MMX by the time they are ready to ship.
Here's a system Joe Ragosta (a die hard Mac advocate) put together:
$2250   266 MHz G3 minitower
$200    Upgrade SDRAM to 96 MB
$500    Quality 17" monitor
$300    Ultra fast/wide PCI SCSI
$400    4 GB ultra/wide SCSI drive (in addition to 4 GB drive with the machine)
$300    Fast video card
------
$3950   Total
Here's the specs on a machine I actually bought:
$875    233 MHz K6 minitower, with DVD drive
$35     32 MB of EDO RAM
$900    Awesome 19" monitor
$75     X2 56K Modem
$35     Diamond Stealth 2MB graphics card
$15     Cheapo SB-16
$100    100 watt speakers
------
$2000   Total
You'll notice I saved $2000, from which I loaned my sister $1500 for a nice used motorcycle. You'll also notice that Joe got a couple of extras that have extremely questionable value. A friend of mine put a system together from scratch, similar to mine with a 17" monitor, 33.6K modem, and CDROM drive, and managed to shave $400 off the price. Here's my rundown on the two systems:
96 MB of RAM - since the PPC is a 32-bit RISC chip, it uses its RAM half as effectively as a 32-bit x86 chip. There are basically no PC applications that even notice memory beyond 64MB, and basically none that fail to work or are unusable at 32MB. While I could have gone to 48MB to be equivalent to Joe's system, I would have derived very little benefit from it. (OTOH, I do plan on an upgrade to 256MB in the future, for programming and experimentation purposes.)
SCSI. No matter how you slice it, disks (and external devices in general) are slower than RAM. Having a disk run twice as fast is irrelevant if your application is reasonably written. The only reason Mac-heads are so big on SCSI is that the god-awful load times of Mac applications *need* every inch of disk speed just to be tolerable. On PCs this is simply not an issue.
4GB IDE hard drive + 4GB SCSI hard drive; does this not just make your skin crawl? Why not just get a big 8GB SCSI hard drive and throw out the damn IDE drive if you are so keen on a fast hard drive?!??! Better yet, why bother with all this SCSI bollocks in the first place?
What the hell is a "Quality 17" monitor"? If you can't see 1600x1200 on it, then what's the point? Personally, I'm very happy with the premium I paid for my nice 19" monitor. It's almost as good as the 21" monitor I have at work.
$300 is the expected price (it's not yet available) of a 3DFX Voodoo 2. I very much doubt that any fast video card for the Mac is going to even approach the performance of such a graphics card. Furthermore, you lose again on the Mac, regardless of the speed of the graphics card, because of its inherently backwards graphics architecture. My crappy graphics card will probably run circles around anything available for the Mac for precisely this reason.
Joe made no mention of a modem or speakers. Do G3s come with them or what?
The 266 G3 might outperform a K6 233 in some situations, but I would have to study those two chips carefully to know for sure. Either way, the premium for a P-II 300 is less than $500, which doesn't come close to justifying the tremendous price differential between a Mac based system and a Windows based system. (I plan to upgrade to a K6-3D 266 or 300 Mhz processor later this year, which will blow away anything from Motorola/IBM, and possibly Intel; I expect this chip to cost me about $400.)
Geeez ... $4000. I wonder if I could get a 500 Mhz DEC Alpha for that price. It would certainly be more compelling than a crappy old Mac!
So while Joe gets a machine that will give him eyestrain, can't pre-emptively multitask, and has a lot less (desirable) installable software out there (I got Quake II and Age of Empires as well; I also installed Gravity, 4DOS, WATCOM C/C++ and Opera, which I have owned all along), I've got a Yamaha Virago for my sister and $500 extra cash in my pocket to boot.
Did Joe just put together a bad system? Well, when it comes to Apple based systems, there simply is not enough latitude. Checking Apple's web site I found that the cheapest system I could get was:
Power Macintosh G3 233Mhz Minitower: 32MB RAM, 4GB EIDE hard drive, 24x (max) CD-ROM drive, audio card, 6MB built-in SGRAM video memory. Total: $2350
Notice the missing monitor. Here's the more acceptable system (without Joe's irrelevant SCSI options):
Power Macintosh G3 266Mhz Minitower: 64MB RAM, 4GB EIDE hard drive, 24x (max) CD-ROM drive, 100MB Zip drive, audio/video card, 6MB built-in SGRAM video memory, K56flex internal modem, Apple ColorSync 17" Display. Total: $4120
The delta (a Zip drive, minus speakers and a monitor) is not worth the $2120 price difference. People just don't need to spend more than $2500 for a complete system these days, and let's face it, Apple can't match these consumer requirements.
(Update: Apple has recently slashed prices in a reaction to the emerging sub-$1000 PC market. But even so, the cheapest Mac you can get is $1700 sans monitor. This is at a time when companies like Compaq are trying to establish the $850 PC market.)
John Carmack has recently announced that the coding for Quake 2 is finished. What relevance does this have to Mac users?? Absolutely NONE, and that's the whole point. In his .plan file, John Carmack discussed ports of Quake 2:
You might make the mistake of thinking this means that Mac users can expect a Rhapsody version of Quake 2 real soon. But think about it more carefully. It's fourth on the list (they feel SGI's IRIX and the hacker OS, Linux, are more important) and is only there because Carmack and crew have a soft spot for the Next OS. Furthermore, it is dependent on the release of Rhapsody, of course. What if they had made the same commitment to porting to Copland?
Obviously, coding for Rhapsody has not begun, and Apple's lack of commitment either way with respect to OpenGL is likely to affect this in a major way. (The folks at id are committed to OpenGL.) Bottom line? Don't expect Quake 2 to be coming to a Mac near you any time soon.
Here's something I picked up off the USENET the other day:
I've been demanding some PC features and I still don't have them, not even with MacOS 8:
-Automatic memory adjustment (RAM Charger is incompatible with some programs)
-Opening menus without pausing the system (Menutasker only allows background applications to continue processing. The foreground application still pauses. And background apps draw over the menus.)
-Universal forward delete (Some people wrote: "TextEdit has been broken for so long that Apple considers the forward delete key to be an 'extra' feature." "All applications use TextEdit, so all applications have the problem.")
-Home and End keys that move to ends of the current line
-Resizing windows from any corner (A couple of years ago, Mac people constantly trivialized this feature. But when Copland news started spreading, these people then started beating Windows users over the head with Copland's ability to move windows by dragging any corner.)
-Menu navigation using the keyboard
-Replacing an existing file in Save As by double clicking the file name instead of having to type it all over again. I would even be satisfied if I could use Option-click, or Option-double click.
Either the operating system has these features or it doesn't. Don't use shareware to cover Apple's ass. Same with Windows lacking certain good Macintosh features.
This confirms my suspicion that Apple's crusty old graphics architecture isn't quite up to the task of allowing background applications to proceed as implemented in MacOS 8. The result is graphics bleeding through windows, as described above. This is pathetic. It does not happen in Windows (it can't, because of the graphics architecture.)
But some of these other things he refers to are downright silly. They aren't even all that hard to implement. Except the RAM Charger thing, which I imagine is the name for the virtual memory manager. I've been told the horror stories of how virtual memory is handled on the Mac; jump address patching, no protection ... *shudder* ... it's a wonder anything works right on the Mac.
Recently, I've been getting into studying CPUs. Mostly x86 CPUs, but the things I have picked up appear to have general applicability to all CPUs. So I visited Motorola's web site in search of details about the newly unveiled PPC 750, a.k.a. the PPC G3 or "Arthur" CPU. Basically, it is a processor introduced at 250 Mhz with numerous micro-architectural improvements over the 604e processors. The major improvements include:
a 4-issue instruction decoder. (Correction: 2-way decoder)
a more direct L2 bus, supporting numerous clock divisors.
a partially non-blocking cache.
an 8-way set associative, 64K split instruction/data L1 cache.
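To make "8-way set associative" concrete, here is a toy C sketch of which cache set a given address maps to. The geometry is my assumption for illustration: 32KB per side (the 64K figure above being instruction + data combined), 8 ways, 32-byte lines, which works out to 128 sets.

```c
/* Toy model of an 8-way set-associative L1: which set does an
 * address map to?  Assumed geometry: 32KB cache, 8 ways, 32-byte
 * lines -> 32768 / (8 * 32) = 128 sets. */
#define CACHE_BYTES 32768u
#define WAYS        8u
#define LINE_BYTES  32u
#define NUM_SETS    (CACHE_BYTES / (WAYS * LINE_BYTES))   /* 128 */

unsigned set_index(unsigned addr)
{
    /* Strip the line-offset bits, then take the set bits. */
    return (addr / LINE_BYTES) % NUM_SETS;
}
```

Under these assumptions, addresses 4KB apart all collide into the same set, and with 8 ways a ninth such line evicts one of the first eight.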
Apple themselves have made a big deal out of this CPU by claiming that the 250Mhz PPC G3 is faster than a 266 Mhz Intel Pentium II. They determined this by the age-old, discredited test of running Photoshop filters (see my explanation above.)
Now let me explain why all this is utter nonsense.
Intel offers CPUs at 300 Mhz. Nuff said. (Update: make that 333Mhz)
The Pentium II, AMD K6, and Cyrix 6x86MX processors all support MMX (a set of SIMD instructions that basically allow more parallel streamed integer calculations per clock.) The fabled VMX instructions have not shown their face yet, and nothing inside the PPC 750 approximates these sorts of capabilities. (Folks believing that VMX will be superior to MMX ought to keep in mind that MMX-2, as well as AMD/Cyrix/Centaur's second generation of SIMD technology, will be available by the time VMX is designed and implemented.)
This new PPC has only 4 execution units: Integer 1, Integer 2, Floating Point and Load/Store. This means that for the instruction decoder to actually achieve its purpose, code must be written to perform in streams of 2 integer, 1 load or store and 1 floating point in 4 sequential instructions. Even for an optimizing, scheduling compiler this is very hard to do on typical code. (Correction: it can only issue to two of the four units at once.)
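The issue constraint above can be illustrated with a toy cycle-count model. This is my own sketch for intuition, not anything from Motorola's documentation: ops issue in program order, at most two per cycle, limited to 2 integer, 1 floating point and 1 load/store per cycle.

```c
#include <string.h>

/* Toy issue model: ops are 'I' (integer), 'F' (float), 'M'
 * (load/store).  Per cycle the core may issue at most 2 ops, in
 * program order, limited to 2 integer, 1 float and 1 load/store.
 * Returns the number of cycles needed to issue the whole stream. */
int issue_cycles(const char *ops)
{
    int total = 0;
    size_t i = 0, n = strlen(ops);
    while (i < n) {
        int issued = 0, ints = 0, fps = 0, mems = 0;
        while (i < n && issued < 2) {
            char c = ops[i];
            if (c == 'I' && ints < 2)      ints++;
            else if (c == 'F' && fps < 1)  fps++;
            else if (c == 'M' && mems < 1) mems++;
            else break;   /* structural hazard: wait for the next cycle */
            issued++;
            i++;
        }
        total++;
    }
    return total;
}
```

In this model a perfectly mixed stream like "IMIFIMIF" sustains 2 ops per clock (4 cycles for 8 ops), while back-to-back memory ops "MMMM" crawl along at 1 per clock, which is exactly why the compiler has to interleave the unit types to get anything out of the decoder.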
There is no real advantage over the Intel Pentium II or AMD K6 processors here, since each of them can also execute 4 or more internal RISC-ops per clock. Moreover, since each x86 instruction can decode into several RISC-ops, fewer x86 instructions are required for the same work, which increases the effective RISC-op throughput.
The PPC 750 has only 1 level of speculation. What this means is that cache misses are devastating to performance: if you cache miss in a tight loop, you will only be able to run around the loop once more before the CPU stalls waiting for the first cache miss to resolve. (Clarification: The PPC architecture uses one reservation station per unit, allowing buffering of up to two instructions per unit and limited out-of-order execution -- the K6 and P-II architectures both use centralized schedulers, which allow a large number of instructions to be buffered before execution.)
Both the Pentium II and K6 processors support out-of-order execution, so only the sections of code reliant on cache fetch results will stall. Even the Cyrix 6x86MX supports 4 levels of speculation. In theory each of these x86 processors could run around a loop several times with an outstanding cache miss before needing to wait for it to resolve. This decreases the number of dead cycles.
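Here is a back-of-the-envelope model of why speculation depth matters for those dead cycles. Again, this is my own toy model for intuition, not vendor data: the CPU can run past at most `depth` unresolved loop branches while a miss is outstanding.

```c
/* Toy model: a load misses and takes miss_latency cycles to come
 * back.  The CPU can speculate past at most 'depth' unresolved
 * branches, i.e. run 'depth' more loop iterations of iter_cycles
 * each, before it must stall.  Returns the dead cycles spent
 * waiting with nothing left to execute. */
int dead_cycles(int miss_latency, int depth, int iter_cycles)
{
    int covered = depth * iter_cycles;
    return miss_latency > covered ? miss_latency - covered : 0;
}
```

With, say, a 20-cycle miss and 4-cycle loop bodies (made-up numbers), one level of speculation leaves dead_cycles(20, 1, 4) == 16 idle cycles, while four levels leave only dead_cycles(20, 4, 4) == 4.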
The PPC has a combined load and store unit allowing at most one load or store per clock. This is obviously inferior to the Pentium II or K6 processors, which have separate load and store units (as well as store buffers) allowing 1 load and 1 store per clock. Against the Cyrix 6x86MX matters are still worse, since that CPU allows 2 loads, 2 stores, or 1 load and 1 store per clock.
In fact, in the Motorola documentation, there is absolutely no mention of instruction reordering buffers, retiring, or any sort of "holding pattern" instruction buffers. That is to say, the PPC 750 is not capable of out-of-order execution (Update: that's not quite true, but it might as well be -- see the comment about centralized schedulers above). In a nutshell, the advantage of out-of-order execution is that instructions that were stalled on previous clocks are able to execute simultaneously with instructions from later clocks. This improves overall parallelism, and therefore actual overall execution throughput.
There is no mention of call/return branch prediction stacks. Uuuh ... excuse me? Why the hell not? It's cheap and simple to implement, its benefits are obvious, and all the new x86 based CPUs have them. This just underscores the lack of serious engineering on the part of Motorola. Don't tell me -- they must use a link register based scheme (and lose many simple prediction opportunities as a result.)
Their L2 cache bus mechanism does not seem to have any advantage over the Pentium II's local L2 cache architecture. AMD has also announced plans to integrate their L2 cache on chip, taking L2 cache performance to even higher levels. (Update: Intel's Celeron and Xeon lines have paved the way for more aggressive L2 cache architectures that make the PPC's look positively anemic, and AMD is about to introduce the K6-3, which continues to raise the integrated L2 cache bar.)
So as you can see, the PPC 750 is not only inferior to Intel based CPUs, but all the Intel clones as well!
Even after an incredibly successful initial outing into the Mac arena, Power Computing has been feeling the pain of paying Apple license fees for their garbage OS just to sell their platforms. As such, they have simply sold off their Mac building resources to Apple, and have moved totally over to the Intel platform. They have in fact been working very closely with Intel, and introduced their first portable PC platform, using the Tilamook CPU, simultaneously with Intel's introduction of that chip.
Motorola and IBM are expected to similarly abandon at least the Mac OS and Rhapsody, perhaps endorsing BeOS, PIOS or Linux. (Update: Indeed, both have abandoned the Mac platform, leaving only UMAX with the Asian market.)
After it has been out for over a year on the PC under DOS, Windows, and Linux, and on a variety of other platforms such as Next and SGI boxes, the Quake port for the Mac is finally completed. That's the delta, folks: one whole year. I hope it was worth the wait. BTW, Quake 2 ships this Christmas, and Unreal will be released early next year -- for the x86 platform, of course.
Here are some comments about MacOS from the infamous John Carmack regarding a possible port of Quake2:
Update: I have purchased, played, and finished Quake II on my PC. It's sooooo cool. It's a significant improvement over the original Quake in terms of graphics and game play. Too bad for Mac users who have to wait ...
In the words of one of my colleagues, "There's something that just makes me sick to my stomach at the prospect of Microsoft bailing out Apple". Of course, nothing serious has changed inside of Apple. Same developers, no direction (no CEO), same market share. In the end the only thing that happened is a check box changed that made Internet Explorer the default browser for the Mac.
This reality didn't keep their stock from doubling in value over what it was just before Amelio was forced out. (To put things in perspective, Apple lost $1 billion during Amelio's tenure.) It should be pointed out that at that same announcement (during MacWorld) nothing (absolutely nothing) was said about Rhapsody. You Mac heads ready for another disappointment?
Around the office (mine) it was speculated that Jobs may have done this just to boost the stock price up so that if/when someone (Oracle?) buys out Apple, Jobs comes out even more ahead. But that would imply that Jobs still had a bunch of Apple stock. Either way, it looks like Apple investors are riding a big wave of nothing.
Today, Be, Inc. demonstrated a lucid understanding of the market by showing the BeOS running on Intel x86 CPUs. They demonstrated the BeOS running side by side on dual Pentiums in one box and dual PPCs in another; the two apparently seemed comparable. A more interesting challenge would be BeOS vs. Windows NT. (For performance you have to pick BeOS as the winner, but NT still wins in apps and market penetration.) The relevance to Apple, of course, is that there is now even less of an argument for using a PPC based system (and therefore less incentive to use Apple based computers.)
Man, all I can say is that these guys are riding a huge wave of gusto.
And all the ads make it sound like it's just as good as Windows 95! :o) Kind of a humiliating thing to be admitting, don't you think? First of all, there is still no pre-emptive multitasking, so MacOS 8.0 is not "as good as" Windows 95. Secondly, isn't Rhapsody right around the corner? Why bother releasing this OS? Is there some reason the Mac community needs a fall back plan in case Rhapsody is broken, or extremely late? Nah ... couldn't be.
Anyways, so I went to Fry's and decided to check out MacOS 8.0. I swear I did a double take. It looks almost exactly like Windows 95! (It was flipped around, with the icons on the other side of the screen.) Of course the illusion was over once I started using it. Same garbagy, inefficient Mac feel, and obviously slow graphics updates. The tiny window controls made it hard to measure up to Windows 95. Now, who's following whom?
What can I say? Just to rub salt in their wounds. Some people have suggested that Jobs was instrumental in ousting Amelio, which would not surprise me in the least. But few people are looking at Jobs to replace Amelio. As of this writing (07/29/97) Apple is still running around like a decapitated chicken.
(Update: Steve Jobs has been named interim CEO of Apple, but they are still looking for a more permanent replacement 11/23/97.)
Power Computing has decided to enter the x86 business. Obviously, on the heels of their very successful entry and basically raping of Apple's market share, they have decided to take on the larger, more competitive but more lucrative x86 market. A good move on their part, which indicates that they realize there is little future in the Mac market and they must diversify or be strangled by Apple's inability to successfully drive a technology.
An unnamed investor recently dumped 1.5 million shares of Apple stock, bringing Apple's stock price down to a 10 year low. Incidentally, Steve Jobs was given 1.5 million shares as part of the agreement to join Apple. Shortly after Jobs was hired, Apple found itself trying to dispell a rumor that he would dump all these shares at his earliest opportunity. (Update: I don't know when it was confirmed, but it is now known that Jobs currently has only 1 share of Apple stock.)
Exponential's 533 Mhz PPC
Late last year, a small start-up company called Exponential announced a design and process for building a 533Mhz PPC based on BiCMOS technology. Such a CPU clearly would have represented a technological leap (at the time) over Digital's 500Mhz Alpha, as well as over any x86 architecture, none of which come anywhere close. But their biggest customer, Apple, decided it was too expensive and has dropped the project. They have instead decided to stick with 300 Mhz low cost solutions.
While Apple's decision may, on its face, seem prudent, it means that they have no top tier workstation to point their finger at. They will not be able to see the system characteristics of next generation PCs until those are upon them. That is, they have given up their opportunity to see, and to present to customers, a window into the next generation of CPUs.
If you meet an Apple pundit who points to this chip as the way of the future (or any 533 Mhz CPU in general), be sure to inform them of Apple's latest decision on the matter. Basically, it's not happening.
Incidentally, some patents of theirs have been recently disclosed (i.e., awarded to them) that suggest they will be making an x86/PPC hybrid microprocessor. An interesting idea indeed; we shall see if Exponential lives long enough for their ideas to bear fruit.
(Update: Exponential has gone into receivership, their patents auctioned off, and their employees disbanded.)
The Floating Point Units
As is well known by now, the Pentium II and Pentium Pro CPUs have a chip bug that is related to floating point to integer conversion (commonly known as the "dan-0411" bug). Those in the know have analyzed it and come to the conclusion that the easiest and best solution is a simple exception handler work around in software. This solution is sufficient because it affects such a small number of applications, which primarily are using Fortran compilers, not assembly language, and the compiler can simply implement a work around in its FPU exception handler code.
What has this all got to do with Apple? Well, it brought some PowerPC folks out of the wood work to explain why the bug does not affect them:
> SoftWindows uses the native (64-bit) PowerPC FPU to emulate the 80-bit Intel FPU. Because of this, almost any program will yield slightly different results from those of a real Intel FPU. This is probably why it gives invalid results with the dan-0411 detection problem. Hmmm. I wonder if Intel will replace my PPro 150 I bought a few months ago. If they have to replace the older chips, they could stand to lose a lot of dough.
>
> Kyle Okula
> firstname.lastname@example.org
That is to say, the old Pentium FDIV bug is fairly minor in comparison with the compromise that SoftWindows imposes on its x86 emulation. It essentially drops the last 11 bits of mantissa precision on every calculation (the 80-bit extended format carries a 64 bit mantissa versus the double's 53), because it simply doesn't have those bits of precision to begin with! The makers of SoftWindows have not, to my knowledge, recalled their product to adjust their software to deal with this problem.
The reason I point this out is that Apple's slightly higher Photoshop performance is probably attributable directly to the fact that PPC CPUs have far fewer bits to calculate in their FPU, and hence can produce results faster.
But, of course, the price to be paid is precision. Is this a reasonable trade off for the PPC? That's arguable from both sides. But in essence, the PPC chip's design is practically optimized for low precision FPU operations, which suit filtering tasks of the sort Photoshop is commonly used for. Once applications start leaving the realm of low precision usage, the PowerPC has to be programmed to emulate higher precision calculations at a great cost to performance, while on the Intel FPU you have a higher ceiling (about 3 more decimal digits of accuracy.)
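You can see the precision gap directly in C, where `double` is the 64-bit format and, on x86 compilers, `long double` is the 80-bit extended format (that mapping is an assumption of this sketch; on some non-x86 targets `long double` is no wider than `double`, which the code tolerates).

```c
#include <float.h>

/* How many extra mantissa bits does the FPU's extended format carry
 * over a plain 64-bit double?  On x86 (80-bit long double) this is
 * 64 - 53 = 11; on a compiler where long double == double it is 0. */
int extra_mantissa_bits(void)
{
    return LDBL_MANT_DIG - DBL_MANT_DIG;
}

/* The value 1 + 2^-54 is rounded away in double precision but
 * survives in a wider long double.  Returns 1 if the extended
 * format kept the bit.  ('volatile' forces rounding to the stored
 * type, avoiding excess-precision surprises on x87.) */
int extended_keeps_the_bit(void)
{
    volatile double d = 1.0 + (double)(DBL_EPSILON / 4);
    volatile long double ld = 1.0L + (long double)DBL_EPSILON / 4;
    return (d == 1.0) && (ld != 1.0L);
}
```

On an x86 compiler, `extra_mantissa_bits()` returns 11 and `extended_keeps_the_bit()` returns 1: exactly the precision that a 64-bit-only FPU emulating the Intel FPU has to throw away on every operation.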
The fact is, Intel (and AMD and Cyrix and Centaur) will erase the floating point advantages of the PowerPC (or disadvantages, depending on your point of view) with the introduction of MMX-2 (and the second generation SIMD designs from AMD, Cyrix and Centaur), which is a 32 bit packed SIMD floating point calculation mechanism. Being able to do two 32 bit floating point operations per clock ought to put the PowerPC and other CPUs in their place. And unlike the PPC 620, or Copland, or the Exponential CPU (see below), this is not just vaporware. At least in the x86 based PC industry, we know this is actually going to happen.
The above article was written before the introduction of a number of new products as well as a number of other industry developments. Primarily:
Copland officially scrapped; another waste of time, money and effort by Apple.
The introductions of newer/faster processors including the Pentium II, Pentium with MMX, the AMD K6 as well as the 200 Mhz PPC604e.
Apple has broken off talks with Be and decided instead to go with the Next OS, acquiring Next as well as appointing founder Steve Jobs to the board. (And paying a ridiculous amount of money for it.)
While the Mac market share has remained relatively stable in the last year, Apple's own market share is being eaten into by the Mac clone manufacturers! (As predicted by many from the very beginning.) Apple has responded by basically throwing red tape at Motorola, IBM and the rest of the clone makers to keep them at bay rather than continuing to support their clone efforts.
Philips and/or Apple have decided to scrap plans to incorporate the TriMedia multimedia processor into future generations of the Mac, leaving a big question as to how the Mac will be able to compete against the emerging multimedia capabilities of x86 based PCs.
3DFX has ported their graphics card to the Mac, thus ensuring that there will be at least some comparable 3D on the Mac. (Of course, 3DFX has been supporting the x86 for nearly a year now.)
Apple has released their latest Newton. Is it a huge PDA or is it a small laptop? I don't see what market they possibly think they are targeting. Pilots are smaller (and a hell of a lot cheaper) and most laptops are faster (and run standard operating systems.) True, this is a case of seeing the glass as half empty vs. half full, but you don't get away with selling half full glasses in the technology market.
Apple will be spending the next little while promoting the Mac OS once again while they adapt Next for the PPC (project code name: Rhapsody.) They have very little choice, as the Mac community's patience was more than tried with Copland, which turned out to be a vaporous carrot on a very long stick. At least with Next they will be working with an OS that has been ported at least once (to the x86, no less; so if you are interested in a sneak peek at Apple's new OS, buy a PC and get yourself a copy of NextStep.) But NextStep was a market failure! Why does Apple believe they will do better with it?
Without a mediaprocessor solution of some kind, Apple will be hard pressed to compete with emerging PC multimedia capabilities on the horizon. Without the TriMedia processor, how will Apple be able to decode MPEG-2 and AC-3 (i.e., DVD)? While the move to Next may be at least roughly comparable to a move to Be from a technological point of view, dropping TriMedia makes absolutely no sense. There are so few competitive solutions.
MacUSER magazine released a comparative article on the Mac versus the PC. It's about as bad and biased a comparison as you could possibly make. (1) They used a single application to measure performance (Adobe Photoshop), which was ported from the Mac to the PC, and which has numerous base code differences, as they eventually admit in some hidden sentence in the middle of the article. (2) They used an ATI graphics card, which has a significant negative impact on the x86 PC's performance. A Matrox in the PC would have been much fairer.
Motorola has a web page detailing their PPC 604e (as well as other processors). This chip is basically the flagship of the Mac and Mac clones thus far. While an impressive architecture for the price, it does appear to be very close in design to the original x86 Pentium (which has been available since 1993.) Both have parallel FPU units, both can issue two integer ops, both have a branching unit. The differences, in fact, appear to favor the Pentium over the PPC. The 604e Load/Store unit is single issue, as opposed to the Pentium, which can pair two Load/Stores under most conditions. As a RISC processor, it suffers from a larger average instruction size (which is not helped by the MacOS's cooperative multitasking, which requires that redundant task management code be hand crafted and present in every application in the system), which means it requires more average bandwidth just to feed the instruction buffers. But don't get me wrong, I'm not bagging on the PPC; it does look similar enough to the Pentium that the right mix of ISA architecture (which I am not familiar with at all on the PPC) might compensate for its slight shortcomings in the areas I describe above.
Using the PPC is about the only sensible thing about the Mac architecture. It is really such a shame that the Mac is not able to put out lower cost solutions considering the huge price advantage of PPCs over comparable x86s. It is no wonder they are losing share to the clone makers. My recommendation to Mac users is that they strongly consider jumping ship to a clone and running the Be OS. It's available right now, and it is only getting better and better. I've been periodically checking the Be website and am always blown away by the steady, consistent progress they have made; and that's on top of the hype they started with! At least with the Be OS, the future competition against the eventual adoption of Windows NT by the x86 camp looks like a fair fight. What's more, Gassée has indicated that he has considered porting the Be OS to the x86 platform (which would be truly exciting, from a cross platform technology point of view as well as a competitive alternative to Microsoft's OSes), but it might be too much to hope for, as it is only a rumor.
Apple is no longer among the top 5 PC suppliers. The total Mac market has dropped to less than 6%. See this CNET story for more details. It is interesting to note Power Computing's growth, even in the face of the diminishing Mac market. An insider has informed me that in order to earn his bonus last year, Gil Amelio arranged for some creative accounting to show that Apple was profitable for exactly one day (the day that the board reviewed him.) Overall, however, the figures showed that Apple in fact lost 1 billion dollars last year. Maybe GateWay will buy them for chump change when they go bankrupt. :o)
I found a pro-Apple article written by John Wyncott which tried to debunk the theory of the inevitable demise of the Apple Macintosh in favor of PCs. This article has had wide distribution so I assume I am not infringing on any copyright by including it, in its (near) entirety.
It goes without saying that I am a pro "WinTel" PC advocate. Heck, I even advocate it over UNIX, Solaris and other supposedly "cleaner" or more "standard" systems/architectures. That isn't to say I always thought poorly of Apple. On the contrary, up until a few years ago, I thought the Apple solution was a truly viable architecture, and their move to the PPC, while daring, seemed to be a step in the right direction. But having been immersed in the industry quite a bit more, and with recent events, taking a step back it's not hard to conclude that Apple is in serious trouble.
Without rebuttal, his arguments sound convincing, but allow me to examine what he says more carefully. Hopefully this will clarify the entire Apple controversy. Please note that the opinions expressed here, with the exception of that written by Mr. Wyncott, are completely my own. I don't represent anyone else's interest, especially not my employer, in this article. Without further ado:
>Read this:
>
>PC Experts:
>
>COMMON MISCONCEPTIONS
>
>You know who seems to know the least about Macintosh computers? PC Experts. I'd always figured that these PC experts were pretty plugged-in to the computer market in general, but it seems like everything they know about a Mac they picked up on the street six or seven years ago.
>
>They've updated their modems, CPUs, RAM, and even their operating system, but they've neglected to update their knowledge about the one computer to which they owe their plug-and-play, mouse-driven, CD-ROM-using, window-clicking world -- the Macintosh. So in an attempt to bring these PC experts into the 90s, I've compiled a list of the most common Apple and Macintosh misconceptions, complete with a reality check for each. Fax it to your PC acquaintances; it really spoils their day when they read good news about Apple.
This is not news. It's just you ranting. Apple hasn't had good news associated with it for quite some time. When was the last time you heard an Apple press release about something that was going to carry them into the future, and it actually turned out as predicted? The move to the PowerPC is the only one I can think of, and that was quite some time ago.
>1. THERE'S NO SOFTWARE FOR MACS There are thousands upon thousands of software titles for the Mac, but it really doesn't matter, since most people are going to use the same six or seven programs anyway. In fact, the top-selling titles on the PC platform are also the top-selling titles on the Mac: Microsoft Word, Microsoft Office, Adobe PageMaker, Microsoft Excel, Adobe Photoshop, Quicken, Netscape Navigator, etc. Even though the Macintosh Software Directory lists over 12,000 Macintosh software titles, chances are you're going to use only a handful of the most popular ones anyway. As the saying goes, "With the PC you've got 50,000 pieces of software you'll never use, and with the Mac there's 12,000 pieces you'll never use."
Most of the apps listed above are getting a bit crusty. Photoshop and PageMaker are not top selling programs on the PC. The PC versions of all these programs are updated as required to meet the needs of the majority market, whereas Mac versions are commonly relegated to an afterthought. I'd like to point out that Microsoft Windows 95 and OS/2 are not available for the Mac (both best selling pieces of software for some time.) Other apps to which people have attached industry-revolutionary status, such as DOOM and Netscape, became available on the Mac well after they were available for the PC.
>The other side of this misconception comes from the fact that retail stores like Egghead Software, Babbage's, Software Etc., etc. carry hardly any Mac software. That's because the vast majority of all Mac software is sold by mail order. It's always been that way, it'll probably always be that way. Incidentally, there are over 500 software applications that are available only for the Mac and not available on the PC at all. Is that reason to celebrate? No. The reality is: you're probably not going to use any of those 500 either.
I'm sure there are about 50,000 pieces of software available for the PC that aren't available for the Mac. The argument that *I* don't use all of them is completely silly. I have the opportunity to use *any* of them, and thus a larger pool from which to draw. Most people own software that didn't make the top 100 selling programs out there. So where do those apps come from? The rest of the software pool.
When it comes to software titles, more is better, because not only does more mean more software diversity, it means more competition, resulting in higher quality and more ground-breaking technological innovations.
Quite simply, the Mac's software scene comprises a smaller selection of less up-to-date applications.
>2. MACS AREN'T PC COMPATIBLE In fact, Mac users can pop an IBM-formatted disk into their floppy drive, read it, write to it, save files in PC formats, and even format a floppy disk in IBM format right from their Mac. This feature is built-in on Macs.
They haven't got much choice; most of the data out there is saved onto PC compatible media. They have to allow Mac users to draw from the deep data archives of the PC world; it's just plain so damn much bigger than the Mac's.
>Do PCs come with this same >"compatibility?" Hell no! They couldn't mount a Mac disk with a saddle.
Of course they could. They just don't care to. There's nothing worthwhile to get from the Mac arena, so why bother investing in the technology to do so?
>What about PC apps? Mac users can run Windows or Windows 95 applications >on their Macs using SoftWindows software from Insignia Solutions.
Thus rendering the latest PowerPC with all the blazing speed of a 10MHz 286. I'm not kidding, folks. You should see a demonstration of this. No version of Windows that I've ever seen has run this slow. Furthermore, Insignia Solutions has not yet come out with a Windows 95 emulator. That means the new crop of 32-bit pre-emptive multitasking applications cannot be run on your Mac. (In fact, the entire concept of "pre-emptive multitasking" is completely foreign to the Mac. It has been promised in something called "Copland", but so far it's vaporware.)
>You can even buy a Mac with an actual PC-board built right into the >machine, so you can have both a Mac and a full-blown PC together in one >box.
You mean to say you can buy two computers and shove them into a single box. Great. That doesn't justify the purchase of the Mac.
>[...] The >reality is: Macs are the most compatible computers on earth, and the only >computer than can run Macintosh, DOS, and Windows applications.
They can't run Win 95, Win NT, QNX, Linux (yet), or OS/2. PCs can. Besides, I think you are neglecting the fact that Amigas can run 68K Mac software, Amiga software, Atari ST software, and PC software.
>3. APPLE IS JUST A NICHE PLAYER IN THE COMPUTER MARKET It's true. Their >tiny little niche has made them only the third largest computer maker in >the world. IBM is just behind Apple as the fourth largest. Funny, I never >really hear IBM referred to as a niche player, even though they sell >fewer computers than Apple. I wonder why that is? Hmmm-only Compaq and >Packard Bell sell more computers than Apple, and then just barely. In >fact, during certain quarters of the year, Apple has outsold both Compaq >and Packard Bell. This niche misconception is perpetuated by the national >media's outwardly biased coverage of Apple. By the way, what kind of >computers do most computer journalists use? Hmmm. Makes you stop and >think. The reality is: Apple is one of the top three computer companies >in the world and has a market share most Fortune 500 companies would kill >for.
Yes, well their current ranking isn't going to help their yearly bottom line.
>4. THE PERSON THAT DESIGNS APPLE ADVERTISING IS A SERIOUS SUBSTANCE >ABUSER This is another widely held misconception, but it is also not >true! [...deleted...]
I actually had never heard this before. But ads are just ads. They don't make the computer any better or worse.
>5. MACS ARE MUCH MORE EXPENSIVE For years Macs were a premium item and >considerably more expensive, but not anymore. Mac prices have now come >down to where they are very competitive with most PC prices.
Yes, this is now well known. What I am wondering is: since the PowerPC CPU is substantially cheaper than the Pentium CPU (or at least it used to be), how is it that Macs are not being offered that seriously undercut PC prices? We ought to be seeing PPC-based Macs with similar SPECint scores and other capabilities at hundreds of dollars less than a similar PC, right?
I've never seen it. Macs are competitive, but never priced *below* the comparable PC. What's the point of entering a price war if you don't win it?
>[...]But I want >to warn you: when you head down to the computer store to compare price >tags, the PCs may still seem cheaper at first. Until you look under the >hood and realize that Macs come with standard features like a built-in >sound card, a built-in video card, and built-in networking hardware and >software. For many PCs, these features are add-ons which have to be >purchased separately and then installed. The reality is: If price, not >value, is your only consideration when buying a computer, maybe you >should buy a PC. But if you're a bit more discerning, and want a computer >with more standard features, you may have to pay a couple of bucks more. >But at least now, it's just a couple of bucks.
You can buy expensive PCs too if you want. That's the whole beauty of PCs: there's a whole range with various capabilities. If the Mac has an internal sound adapter, that would suggest that it's not naturally upgradable, or at the very least that there is no market for Mac sound cards because there is little incentive for 3rd-party developers to build them. I'm wondering how Macs are going to deal with the DVD, MPEG-2 and Dolby AC-3 standards coming down the pipe. At least on an x86 PC, you know that one way or another someone will figure out how to do it.
>6. MAC USERS ARE ARROGANT We're not arrogant. We're frustrated (and a >little bit spoiled). We've spent our entire Macintosh lives defending our >purchase.
Gee, I wonder why that is. (See any of above.)
>[...] We've been faced with all the misconceptions listed here, the >teasing, harassment, and outward media bias against Apple and the Mac for >years, all the while knowing we're using the best computer on earth.
Nonsense. Here's an imitation of me trying to use a Mac:
"Ok, let me start up the paint program. Ok, cool, could use a few more tools but I'll live with it. Wait a sec. Why did it go full screen? How do I switch to the web browser? I just want to ping-pong between this and my web browser so that I can sort of edit and upload the picture as I go. What do you mean I have to save the damn thing to the clipboard, exit, run Netscape, edit my web page, exit Netscape and restart the paint program? That's absolutely ludicrous! Ok, never mind. Let's just bring up SoftWindows and I'll do it from there. Ok here it is, ... hmmm ... this is taking an awful long time ... *snore* ... ok, there it is. It went full screen again. What is it with these Mac apps? Must they all go full screen? Wait, how come it's not working? Oh wait a sec, there it goes. *Gosh* that's slow. Ok, ... oh wait, how will I get my work over to the Mac again? The damn thing went full screen on me! How am I going to save my work so I can use it with other Mac apps??!?! Aaahhh!!!!!!!"
The scenario I describe above is not just a fanciful dramatization. I am a web author (as I am sure must be clear.) My concept of web page authoring is the ability to take content from a variety of sources, edit each piece in its own content-specific editor, and simply combine them together. I don't do this as a step-by-step process, but rather by putting things together in arbitrary sequence, with a desire to check my results via feedback from partially completed output. To attempt this on a Mac is laughable.
I need to scan and edit my graphics in one window, using my favorite graphics editor of course. I need to pull and combine text and data from my word processor, text editor, spreadsheet and newsreader. I may also capture a desktop in order to grab the output from some graphical application. Finally I need my web browser up to preview my results (I also use it to swipe URLs and content from other people's pages.) Under Windows 95, I can (and regularly do) do all of this without closing any windows. I can also choose my favorite app for each task without a hitch. I can also play back .MOD files or listen to RealAudio at the same time. Pre-emptive multitasking is not a feature I or anyone else should be willing to give up. All I can say is, web authoring on the Mac must be nearly impossible.
>Microsoft knows the Mac Operating System is the best, too! That's why >Windows 95 looks and acts like it does.
Well, from my point of view, Windows 95 looks and acts like OS/2 (a very good OS that fell victim to IBM's brilliant* marketing) while delivering pre-emptive multitasking (which I use constantly.) From what little I've used of Macs, Win95 feels and acts completely different from the MacOS (it actually feels useful!)
Bill Gates (note that I don't hold him in very high esteem, though I am impressed with his degree of success) recently made a comment which I think is so perfectly fitting: "I can't believe Apple has sunk billions into research and have so little to show for it." The MacOS of today is the same obsolete architecture they've been using since 1984. A GUI alone does not define an OS.
>Believe it or not, I've actually had arguments with PC experts that try >to convince me that Apple copied Windows 3.1!
I've had arguments with Mac users who think that Apple invented the concept of the GUI.
>[...]The reality is: we're tired of trying to make >Microsoft-brainwashed PC users see the light. If that makes us come off >as a bit arrogant, so be it.
I have to admit, I never believed in stereotypes until I started talking to Mac users. You guys actually are all nearly alike! I've had email and USENET flame wars with dozens of Mac users, none of whom offer arguments that go beyond Mr. Wyncott's. In fact they all sound nearly the same.
One thing that pisses me off is that some of these Mac users try to tell me that the Mac is capable of pre-emptive multitasking (or some other such thing that the Mac has never been capable of), and because I don't follow Mac technology that carefully I don't always know whether or not they are telling the truth (with respect to the state of the art), so I cannot debate them on the point. Then I find out later that the idiot had no idea what (s)he was talking about, and by then it's too late for rebuttal since the thread has degenerated into insults and name calling.
In this vein, I'll take the opportunity to set the record straight: although the PowerPC is a good CPU, quite capable of supporting a pre-emptive multitasking OS, the Mac OS does not take advantage of it. The Mac's concept of multitasking is limited to event-driven, cooperative time slicing, the way Win 3.1 did it.
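The difference is easy to see in miniature. Below is a minimal sketch (purely illustrative; nothing here is a Mac or Windows API) of a cooperative scheduler: tasks interleave only because each one voluntarily yields, so a single task that never yields would starve everything else. That is exactly the MacOS/Win 3.1 failure mode; a preemptive kernel, by contrast, forcibly suspends tasks on a timer tick regardless of their cooperation.

```python
def cooperative_run(tasks, steps=10):
    """Round-robin over generator tasks; each must yield voluntarily.
    A task that never yields would monopolize the CPU forever --
    the cooperative-multitasking failure mode described above."""
    log = []
    queue = list(tasks)
    for _ in range(steps):
        if not queue:
            break
        task = queue.pop(0)
        try:
            log.append(next(task))   # run until the task's next yield
            queue.append(task)       # it cooperated; reschedule it
        except StopIteration:
            pass                     # task finished; drop it

    return log

def task(name, n):
    for i in range(n):
        yield f"{name}{i}"           # voluntary yield point

order = cooperative_run([task("A", 2), task("B", 2)])
# Interleaving happens only because each task yields: A0 B0 A1 B1
```

Replace one generator with a plain infinite loop and the "OS" never regains control; that, in a nutshell, is why one misbehaving Mac application could hang the whole machine.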
>7. THE MAC IS ONLY FOR GRAPHICS The Mac isn't only for graphics, it's >just that it's great at graphics, and it's no surprise that it's the >industry standard for professional graphic design.
I've heard this. It seems odd, since the Mac doesn't offer anything intrinsic that would make it any better for graphics. I suspect that since Adobe Photoshop is among the few worthwhile titles available for the Mac, it tends to get emphasized more, and end users are falsely led to believe that the Mac is intrinsically better at graphics.
>[...] But the Mac rules in >other areas. For example, PC users are often surprised to hear that the >Mac has 50% share of the pharmaceutical, chemical, biotechnology, >scientific, and engineering computer markets.
Sure. Notably absent are the consumer and commercial markets.
>[...]The Mac also dominates the >MIDI and music industry, and is the industry standard for digital video >production. And needless to say, the Mac has the lion's share of the >educational market in America as well.
That's the great thing about education. You learn about the Mac, and you look at the job market. Then once you are done, you make an educated choice and start learning about x86-based PCs. Besides, it's pretty easy to own a market when you give away your product (most schools got their Apples for free or at a substantial discount.)
>[...]The reality is: a Mac is for >whatever you need it for: accounting, word processing, graphics, music, >engineering, multimedia, or surfing the Internet. > >8. APPLES'S GOING OUT OF BUSINESS The national media have been >consistently wrong about Apple's demise and I can't image that people >still put any faith in them. It would be like a sports odds-maker who >consistently picks the wrong team every single time, for nearly 12 >straight years.
I hardly imagine anyone predicted Apple would go under in 1984 or 1985. I think you exaggerate slightly. My suspicion about Apple going under is fairly recent. (I've been predicting for about two years that Apple would not combust, but rather fade away in the coming years; I still think this.) As evidence I would submit the fact that Apple has been approached by numerous companies with buyout propositions. My sources say IBM, Sun and Oracle have all been interested in purchasing Apple. In 1995, IBM made a bid that was at fair market value at the time, but which Apple rejected. As of this writing that bid is worth more than twice the current value of Apple.
I think the players still interested in purchasing Apple are just going to wait until Apple's stock bottoms out (which it has not yet) before making another bid that Apple will have no choice but to accept.
>[...] You'd think after a while you might lose a little faith >in his ability to predict the future, but PC users line up to hear the >news.
Unlikely. Most PC users/owners are quite apathetic about the Apple community. Why should we care? Apple has no impact on us.
>[...] The fact is, Apple may get bought out, or merge with another >company, but their going out of business is pretty unlikely.
Yes, but as what? If Sun buys Apple, then it's likely they'll port Solaris onto it and market it as a cheap Java/Internet box. If IBM buys it, it will likely just stagnate and be mismanaged; personality clashes would cause droves of Apple employees to up and leave, and Apple would fizzle into oblivion. So far I have not seen a really good scenario where somebody buys Apple and simply continues the Macintosh line.
>Here's an Apple stat most PC users would find surprising. In the Fortune >500 listing of America's largest businesses, Apple took the 114 spot >(ahead of household names like McDonald's and Federal Express). So where >does the Fortune 500 list find Microsoft, the computer software mega >giant? About 105 spots behind Apple down at number 219. Is that >surprising? It shouldn't be, since Microsoft is only a $5-billion >company, and Apple is an $11-billion company. The reality is: It could be >much worse for Apple-they could be Packard Bell.
Having money in the bank is not sufficient to make a good product or to hold onto a market. Ask Atari. They folded recently even though they had $50 million in the bank. They could have funded any project they wished, but decided the Atari name had been spoiled by too many failures. Of course, Apple has more money in the bank and more politics going on in its management, meaning that they will probably be around for a while; certainly they'll outlast their market relevance.
>9. SOFTWARE DEVELOPERS ARE FLEEING THE MAC MARKET Another misconception. >Been around for years, it's just totally wrong. Granted, Apple had done a >totally lame job of working with and supporting developers of Mac >software. They know that; that's why there's Guy Kawasaki and Heidi >Roisen. They're refocusing Apple's attention on developers, and it must >be working, because Apple's most recent Macintosh software developers >conference had a record-breaking attendance.
I don't know of any major companies working on new Mac applications. And when I talk to PC developers, Mac ports are the furthest thing from their mind.
>You now also see more and more PC software developers putting both a Mac >version and a PC version of their software on the same CD-ROM.
I've heard rumors of this, but I haven't seen one of these yet.
>[...]They're catching on to the fact that there are 56,000,000 sales >possibilities. The reality is: There's never been more Mac software than >there is right now.
*Hee Hee Hee Hee!* :) That's pretty funny. Yes, I agree, nobody's software base is actually decreasing. The more interesting question is one of relative growth versus the PC industry.
>Well that's it. These common misconceptions about Apple and the Mac have >been developed over nearly a 12 year period, so don't expect PC users to >change their long-held beliefs overnight, regardless of what the facts >and figures show.
That's quite odd. I certainly have not held a consistent view of Apple for the past 12 years, and I can't think of anyone who has. Up until about 3 years ago, I admired Apple for being able to hold about 20-30% of the PC market as the single manufacturer of Apple-type computers. Their performance was better than a PC's, and their OS was a robust GUI that had been years in development. From the outset the Mac had been an innovative, likable machine.
But in recent years, reality has set in and the Apple solution just does not look viable at all. Apple decided to go with the PowerPC, but the price/performance ratio has not worked out as well as they expected (the current top-of-the-line PPC is only slightly cheaper than a top-of-the-line P6, and somewhat slower; and that's without bringing up the PPC 620 fiasco.) The MacOS is completely stagnant and only compares favorably with Win 3.x, which has been superseded by the vastly superior Windows 95 (and OS/2). Apple has sunk zillions into the Newton, which has not yet seen profitability (though their choice of the StrongARM architecture will probably save the Newton project, if Apple survives that long.) 3rd-party hardware manufacturers do not take Apple seriously.
>The reality is: nobody needs Apple to succeed more than the PC user. And >luckily for them, it will.
PCs have not been following Apple for quite some time. Windows 95 is not the answer to the Mac; it's the answer to OS/2 Warp (which for several months before the release of Windows 95 was the best-selling piece of software.) Windows 95's biggest improvement over 3.11 is its pre-emptive multitasking architecture. The GUI could easily have been retrofitted onto Windows 3.11, and MS has proven this with the soon-to-be-released Windows NT 4.0. With pre-emptive multitasking, there's finally a good, sound, technological reason for having a GUI.
Apple cannot justify its use of a GUI beyond saying that it makes the machine so easy even a dummy can use it. Of course, that's its whole problem: it levels the playing field to the point where nobody can get anything more done than the average dummy. Windows 3.x seemed to have the same effect. With Windows 95, I am seeing something different. I am seeing astute developers realize the potential of a truly pre-emptive multitasking system and write all sorts of clever apps that take advantage of it.
I had the fortunate opportunity to talk at great length with a former Apple employee about the Mac architecture, specifically in the area of graphics (QuickDraw and GX) and their upcoming "Copland" OS. It brought me up to speed about the Mac in general, and has convinced me more than ever that the Mac is doomed to failure.
In short this is what he convinced me of about Copland: It will not deliver anything resembling true, preemptive multitasking. The hype is just that; hype.
The MacOS's graphics architecture is a mess. Although accelerators are supported, the Mac does not support asynchronous pixel updating in any way, shape or form. That is to say, there is no proper mutexing between the graphics device and applications, which can obtain direct access to the "graphics port" (the frame buffer and associated graphics state) at any time. This means that graphics operations must be serialized with the rest of the system.
Having been a part of the PC graphics industry for several years, I cannot overstate how detrimental this poor architecture is to overall graphics performance. Asynchronous graphics rendering allows fill rates of potentially hundreds of millions of pixels per second as part of standard operation (though Windows PCs ordinarily average only about 35 million pixels per second during intensive graphics operations, due to some inefficiencies of "GDI") while using a fraction of the host CPU bandwidth. But with a serial model, this rate is essentially bounded by PCI bandwidth, and could only hope to approach asynchronous speeds by chewing up 100% of the processor. (This is probably why the BeBox is so much faster than the Mac, but not so much faster than PCs; it has that extra 100% of processor bandwidth for free!)
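A quick back-of-the-envelope check makes that serial-model ceiling concrete. The numbers below are assumptions, not measurements: a 32-bit PCI bus at 33 MHz (the common configuration of the day) and 32-bit color.

```python
# Rough ceiling on serialized, host-pushed pixel traffic.
# Assumptions (not measurements): 32-bit PCI at 33 MHz, 32-bit color.
bus_width_bytes = 4                  # 32-bit PCI bus
bus_clock_hz = 33_000_000            # 33 MHz
bytes_per_pixel = 4                  # 32-bit color

pci_bytes_per_sec = bus_width_bytes * bus_clock_hz       # ~132 MB/s peak
serial_pixel_ceiling = pci_bytes_per_sec // bytes_per_pixel

print(serial_pixel_ceiling)          # ~33 million pixels/s at best
```

That ~33 Mpixel/s ceiling sits right next to the ~35 Mpixel/s GDI figure quoted above, while an accelerator rendering asynchronously out of its own local memory is not bound by the bus at all.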
It boils down to a parallel processing solution versus a serial solution. As a result, as a gut-feel kind of estimate, I'd say that PC graphics are about 5 to 10 times faster in ordinary day-to-day operation than the MacOS. And Copland does not solve this in any way, shape or form.
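The parallel-versus-serial distinction can be sketched as two submission models (a hypothetical API for illustration; this is neither QuickDraw nor GDI): in the serial model every drawing call blocks until the device finishes, so the CPU and the accelerator never overlap, while in the asynchronous model the CPU drops commands into a queue and moves on as a device thread drains it.

```python
import queue
import threading

class AsyncDevice:
    """Parallel model: submit() returns immediately; a device thread
    drains the command queue while the CPU does other work."""
    def __init__(self):
        self.q = queue.Queue()
        self.done = []
        self.worker = threading.Thread(target=self._drain, daemon=True)
        self.worker.start()

    def _drain(self):
        while True:
            cmd = self.q.get()
            if cmd is None:          # sentinel: shut down
                break
            self.done.append(cmd)    # "render" the command

    def submit(self, cmd):
        self.q.put(cmd)              # non-blocking hand-off

    def finish(self):
        self.q.put(None)
        self.worker.join()           # wait for outstanding work

class SerialDevice:
    """Serial model: the caller waits for every command to complete,
    so host CPU time and device time simply add up."""
    def __init__(self):
        self.done = []

    def submit(self, cmd):
        self.done.append(cmd)        # caller is stalled until done

async_dev = AsyncDevice()
for i in range(3):
    async_dev.submit(f"blit{i}")     # CPU keeps going after each submit
async_dev.finish()

serial_dev = SerialDevice()
for i in range(3):
    serial_dev.submit(f"blit{i}")    # CPU stalls inside each submit
```

Both devices end up rendering the same commands; the difference is that the asynchronous host spent the interim free to run application code, which is the "extra 100% of processor bandwidth" point above.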
On a slightly more positive note for the Mac, Apple has recently announced that it will use the Philips "TriMedia" multimedia processor in future versions of the Mac. Given what I know about the TriMedia, this ought to help the Mac in every area except graphics. But even in the PC space, where media processors have shown a little more success, they have not yet proven long-term viability.
Another slightly positive note for the Mac (but not for the MacOS): Linux has recently shipped for the Mac. I suppose that with Linux running on both x86-based PCs and PowerPC-based PCs, we can finally run some benchmarks and come up with believable performance comparisons of the PPC architecture versus the x86 architecture that differentiate things from a hardware point of view. Hey! It also means there are now twice as many C compilers available for the PPC! :o)