As the hounds close in . . .

It was just a matter of time. I suspect in addition to the civil penalties, what Intel feared the most was the discovery process and the dirty laundry that would inevitably be aired in public as a result.
 
'Bout time. This will make things more competitive and possibly drive processor and system board prices down. More bang-for-the-buck features.
 
Wonderful.
I've always been a big AMD fan and I hope they can stay competitive and in business.
 
Originally Posted By: SrDriver
Wonder if in the end the consumer will end up paying the price for this?


Everything does. However, with a near-monopoly, Intel can charge whatever they want. Their new commercials are kind of cute, but they get tiring after watching them once.
 
Originally Posted By: SrDriver
Wonder if in the end the consumer will end up paying the price for this?


How so? AMD is the ONLY thing keeping Intel honest on prices. If anything, AMD's continued presence and a fresh injection of 1.2B into their operation will allow even more price competition. This is a great day for the consumer.
 
I've built and used far more AMD machines in my day than Intel, due to price and performance.

I was an avid overclocker, from the old Thunderbird, Palomino, and Barton days. Before that it was just the K6 series of processors. Pricing was always great, and AMD's processors typically outpaced the Intel offerings up until recently, with the Core Duos, Core 2 Duos, etc. (an enhanced P3 architecture, which was superior to the P4's).

Now we won't really have to pay an arm and a leg to Intel, and AMD's offerings will improve and pricing will become much more competitive.

Now if only NVidia would release a SoundStorm II sound card like the onboard SoundStorm on the nForce2 mobos.
 
I've used AMDs for more years than I care to admit.

But they currently don't have anything that can touch Nehalem.

That's a problem.
 
Originally Posted By: Volvohead
I've used AMDs for more years than I care to admit.

But they currently don't have anything that can touch Nehalem.

That's a problem.


Yep, if you've got extra money to spend, or if you want the most power you can throw at a problem (and no price constraints), I would agree that the i7 is the king of the hill. For me, when I balance my actual power needs and give some weight to cost, the Phenom II X4 starts to look pretty good. The highest-binned i7's are fast, but they're not 4-5 times as fast. That's a pretty huge price delta.

From Newegg:

AMD Phenom II X4 965 = $195.99
Intel i7 975 = $999.99
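
A quick back-of-the-envelope check on that delta (a minimal sketch, using just the Newegg prices quoted above):

Code:
# Rough price-ratio check using the prices listed above.
phenom_ii_x4_965 = 195.99   # USD
core_i7_975 = 999.99        # USD

ratio = core_i7_975 / phenom_ii_x4_965
print(f"The i7 975 costs about {ratio:.1f}x as much as the Phenom II X4 965")
# ~5.1x the price -- so unless it's also ~5x faster for your workload,
# the Phenom wins on price/performance.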

Best,
 
Originally Posted By: SrDriver
Wonder if in the end the consumer will end up paying the price for this?
The consumer enjoys paying.
 
That's always been AMD's "classic" argument: we're not as quick, but we offer better value.

But in the early part of this decade, they WERE the fastest silicon, and at a fair price.

When I'm building a power workstation or entry server, I do consider raw power as a value-adding factor. A $400-$600 Xeon can still be a pretty good value against a $200 Phenom right now. Clock speed is only part of it.

AMD can nullify that argument by making a top processor again. Hopefully, this money helps them do that.
 
I didn't realize this:

Quote:
Intel controls about 80 percent of the market for microprocessors, the key chips inside PCs and servers.

My last upgrade was a E5200 Intel - only because xbit labs gave it the best bang for $$ when overclocked. But I could easily be persuaded to go with AMD.
 
Nothing like paying off the guys that copied your processor to begin with.

Intel's market dominance has been primarily due to position and total platform support. The last GOOD chipset AMD made in-house was the 768.

When they discontinued production, they eliminated themselves from being a serious competitor.

Options from VIA, SiS and ALI were JOKES compared to the solid solutions offered by Intel, and made AMD the choice for those who couldn't afford the better Intel solution. This is how AMD created the reputation of being the "cheap" CPU choice. Notice I didn't say value.

In 1998, Intel came out with a functional demonstration of Merced at COMDEX. It was a 64-bit server and workstation CPU. This was a completely different architecture, IA64, and was not backwards compatible with x86. It was later renamed to Itanium, and has NEVER done well in terms of high-end server sales. Its x86 incompatibility pretty much sealed its fate there.

Shortly thereafter, Intel's next foray into oddity was the adoption of the NetBurst architecture. At the time, the move made sense: die process scaling had somewhat plateaued, and the expectation that CPU operating frequencies would be capped as a result led Intel to lengthen the processing pipeline. Traditional x86 processors, such as Intel's Pentium 3, used a 10-stage pipe. Intel extended the pipe to 20 stages with Willamette (and later to 31 with Prescott), which meant each stage did less work and let the processor scale to much higher clock speeds than its 10-stage counterparts. The caveat, of course, was that with the added stages the CPU could not perform the same amount of work per cycle.

This is why a processor like the AMD Athlon was never directly frequency-comparable to a P4; they operate differently. It also led to the "processor naming" fiasco that both companies participated in. Since Intel was able to offer CPUs operating at a much higher frequency, AMD looked bad by comparison. Even though lower-frequency AMD CPUs, which were VERY similar to Intel's P3 CPUs, were able to equal, and often surpass, the P4 in performance, the numbers looked bad on paper, and uneducated consumers simply assumed the processor with the higher number beside it was faster.

This led to AMD adopting a processor speed-rating scheme that had formerly been used by past competitor Cyrix (later acquired by VIA) on their CPUs. It was a "Performance Rating", or PR rating, for the CPU, meant to give an indication of performance relative to Intel's P4 CPUs. This is where the "2200+" naming came from.
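
As a rough illustration of the frequency-versus-IPC trade-off behind those ratings (a minimal sketch; the clock speeds and IPC figures below are illustrative assumptions, not benchmark data):

Code:
# Rough model: relative performance ~ clock (GHz) x instructions per clock (IPC).
# The IPC numbers here are made-up illustrations, not measured values.
cpus = {
    "Athlon XP (short pipe)": {"clock_ghz": 1.8, "ipc": 1.2},
    "Pentium 4 (long pipe)":  {"clock_ghz": 2.2, "ipc": 1.0},
}

for name, c in cpus.items():
    perf = c["clock_ghz"] * c["ipc"]
    print(f"{name}: {c['clock_ghz']} GHz x {c['ipc']} IPC = {perf:.2f} relative units")

# With these assumed numbers, the 1.8 GHz part roughly matches the 2.2 GHz part,
# which is the idea behind the "2200+" style performance ratings.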

Intel of course figured they could play the game as well, and so they adopted an even MORE bizarre naming scheme whose numbers had zero basis in ANYTHING performance-related.

Behind the scenes, of course, Intel had continued to develop the P3 CPU. They adopted it for use in laptops and worked extensively on power management. Because die process advancements came along much more quickly than Intel had initially expected, this architecture, which began life as the "Pentium M", was able not only to grow but to prosper in the mobile arena. It also led to a complete platform, known as Centrino. Here AMD was not even remotely competitive with its "select" desktop CPUs fitted to laptops; compounded by the complete absence of a platform solution, AMD's notebook offerings were a joke for anybody who was serious about mobility.

During this time, AMD came up with a 64-bit extension to the x86 architecture (AMD64, or x86-64). It removed the existing memory cap and allowed the execution of 64-bit code on x86 while remaining completely backwards compatible with traditional 32-bit x86 software. Since the CPUs did not require special software and the performance was excellent, the Athlon 64 took off, even managing to garner a small corner of the value server market.
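
To put a number on that memory cap (simple address-space arithmetic, nothing AMD-specific):

Code:
# Address space reachable with 32-bit vs 64-bit pointers.
for bits in (32, 64):
    max_bytes = 2 ** bits
    print(f"{bits}-bit addressing: 2^{bits} = {max_bytes:,} bytes "
          f"(~{max_bytes / 2**30:,.0f} GiB)")

# 32-bit x86 tops out around 4 GiB of directly addressable memory;
# the 64-bit extension removed that ceiling while still running 32-bit code.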

Intel's failure to win widespread adoption of Itanium, and the resulting lack of 64-bit support in the desktop market, led to the next logical decision: license the address-extension technology from AMD. And they did.

Shortly thereafter, Intel migrated their mobile solution, now called "Core", which had gone through various incarnations at this point, including Dothan and Yonah, to the desktop arena. It was already multi-core, and was significantly faster than anything AMD had to offer. It also scaled VERY well, due to its mobile roots.

Intel's slowest quad-core offering, the Q6600, was faster than AMD's FASTEST quad-core offering. And this is where the trouble began. AMD had been milking the (very successful) Athlon architecture for a very long time, and Core took them by surprise. Intel's resources allowed them to keep the train rolling, and the i7 followed shortly thereafter, widening the gap. And this is where we sit today.

Thankfully for AMD, purchasing ATI, the Canadian graphics manufacturer, put them in a position to start creating chipsets again. This will hopefully position them to be platform-competitive with Intel from a quality perspective, as their lack of solid chipsets has always been their biggest detriment.



*********************


On another note, this influx of cash is going to widen the already large gap between ATI and NVidia. The latter is in a bad way right now, since they are not licensed to manufacture chipsets for Intel's latest CPUs, and their former partner, AMD, is no longer interested in its past mistress since the acquisition of ATI. Between low die yields and engineering FUBARs, NVidia's future is not looking rosy.
 
AMD's latest quad-core is very competitive with the Core 2 Duo/Quad generation of Intel CPUs. However, it doesn't seem that competitive with the latest i7/i5 generation.

Hopefully AMD can cook up something quick, because it seems like Intel knows it can milk us for the premium right now.
 
Graphics is AMD's ace-in-the-hole going forward. It is the one area where Intel has tried mightily for many years, and mainly failed.

As much as graphics play an important role in the desktop now, that role will increase dramatically going forward. Touch and 3D interfaces at the desktop level are going to accelerate incredibly in the next five years. The balance of resource allocation in system design has been shifting inexorably towards the graphics interface for several years now; the GPU is nearly as important as the CPU. Vista/7 is only the tip of that iceberg. We may see mainstream systems without mechanical keyboards and outboard mice before long. All the smart phones are just a taste of it.

If AMD can effectively develop integrated GPU/chipset/CPU solutions that better leverage their graphics dominance, they can take future desktop market share, even if the CPU component is not as good as Intel's. But I think they only have one or two generations to do so.
 
And with NVidia's future uncertain, Intel may buy up the "ashes" of that company if it does fail, and bolster their graphics capability. From what I've heard, Intel turned DOWN the offer to buy NVidia last go around.....
 