chrmjenkins
Apr 6, 11:36 AM
That isn't what this story reads, and I don't think anyone but you and I have even read the actual facts supposed here.
I actually find this one of the least accurate stories ever posted on MacRumors.com for several reasons... the OP is assuming ULV in the 13" MBA. The OP is assuming that if SB IGP is good enough for MBP it's fine for MBA. There is no rumor or timeframe listing these chips especially not in the 13" MBA. It seems like it's a blatant attempt to stir up activity without any real facts, rumors, or even common knowledge about the chips used in the MBAs.
Certainly people haven't read the story, or they're somehow focusing on the 11" MBA. Sure, this would be fine for the 11" MBA in terms of CPU clock speed, but even then it's a gigantic loss in the graphics capabilities. That leads to a problem with the author's "good enough for the 13" MBP, then good enough for the MBA" reasoning. The IGP clock speed used in this ULV chip will mean nearly a 50% drop in graphics performance. That, for me, doesn't equate to "if this, then that"...
I am disappointed with MR for even writing such a poor piece of garbage. Forget that I cannot stand the SB IGP... the assumptions made here are absurd! It definitely doesn't warrant this sort of reply from the fans of the MBA. You and I could assail things all day, but that isn't the story written.
Given Apple's willingness to go with it on the 13", I'm inclined to go with the reasoning that they'll use it here. The argument that it will be a big step down from the 320M is kind of moot given that anyone will say you're crazy if you try to insist that a MBA should be used for anything like gaming or graphical work (read anyone as Apple). You also have to remember that the 320M is downclocked in the MBAs too compared to the 13", so the drop isn't as drastic as you state.
The combination of a lower or equal TDP, a GPU that doesn't need its own heatsink because it's integrated into the CPU, and the very likely prolonged battery life for the MBA makes it pretty much a done deal for the MBA.
So is that also true for the difference between SV and LV? If that is the case, the Core i7-2649M you cite above (2.3GHz LV chip) should be faster than the 2.3GHz i5 in the low-end 13" Pro?
Thanks!
He didn't quite tell the whole story. An LV and a ULV chip likely went through different binning, as their performance at the same settings varies because the silicon they are built on varies. The chips that work at the extremes (say, Intel's extreme desktop processors or the lowest-voltage CPUs they offer) are likely the top performers in their binning tests. Just because a chip can function as an LV part doesn't mean it would meet the requirements for ULV, for example. However, if the ULV chip were scaled to the LV part's speed and voltage, it would function just fine.
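The binning idea above can be sketched in a few lines. This is purely illustrative logic, not Intel's actual process; the voltage thresholds are made up for the example.

```python
def bin_chip(min_stable_voltage: float) -> str:
    """Classify a die by the lowest voltage at which it runs stably
    at the target frequency (thresholds are hypothetical)."""
    if min_stable_voltage <= 0.85:
        return "ULV"   # best silicon: stable at very low voltage
    elif min_stable_voltage <= 1.00:
        return "LV"
    else:
        return "SV"    # standard voltage

# A die that only qualifies as LV fails the ULV requirement, but a
# ULV-capable die would run fine at LV speed and voltage.
print(bin_chip(0.80))  # ULV
print(bin_chip(0.95))  # LV
print(bin_chip(1.10))  # SV
```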
notabadname
Mar 22, 02:01 PM
The screen is not 50% smaller. Nice way of making yourself look stupid.
Playbook has that elusive flash support out of the box which every apple fanboy wants to hide under the rug.
OS is more eloquent than iOS.
Well, if you are going to tell people their posts make them look stupid, perhaps you should consider your own, and read a dictionary before throwing around three-syllable words. Your use of the word "eloquent" is incorrect. "Eloquent" is not a word that applies to a software operating system.
Eloquent: The quality of artistry and persuasiveness in speech or writing; the practice or art of using language with fluency and aptness; fluent, forcible, elegant or persuasive speaking in public.
As in; "Your post was not eloquent".
Kevin Monahan
Apr 6, 01:53 PM
I don't believe the mercury engine works on anything but nVidia cards.
Close, but not quite right.
The Mercury Playback Engine is composed of 3 things:
1. 64 Bit Application
2. Multithreaded Application
3. Processing of some things using CUDA (an NVIDIA card)
If you don't have a CUDA-based video card, you still have the Mercury Playback Engine (software) available. What you probably meant to say is that hardware acceleration for the Mercury Playback Engine is not available unless you have a CUDA card.
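The distinction above can be sketched as simple decision logic. This is not Adobe's actual code, just an illustration of the point: the software engine is always there, and CUDA only adds hardware acceleration on top.

```python
def mercury_mode(has_cuda_card: bool) -> str:
    """Illustrative sketch: the Mercury Playback Engine is always
    present in software form; a CUDA card only adds acceleration."""
    engine = ["64-bit", "multithreaded"]  # always available
    if has_cuda_card:
        engine.append("CUDA hardware acceleration")
    return "Mercury Playback Engine: " + ", ".join(engine)

print(mercury_mode(False))  # software-only MPE
print(mercury_mode(True))   # GPU-accelerated MPE
```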
More info: http://blogs.adobe.com/premiereprotraining/2011/02/cuda-mercury-playback-engine-and-adobe-premiere-pro.html
Best,
Kevin
bokdol
Aug 18, 09:05 AM
what i don't get sometimes is how people get so excited over how these intel machines are better than the powerpc. and most of these are from recently converted mac users, screaming about how much better intel is. but i hope some people out there realize of course these machines will be fast. it's called technology. it advances as time goes by. a newer top-end machine SHOULD be better.
now the question is really how much better should new technology be compared to 2-3 year old tech? was it a big enough jump? yeah the case design is freakin' awesome, but sheesh, all this pro-intel babble is foolish. it's like saying my 486 is better than my commodore 64.
or maybe i am just sad that my 1.8 g5 single just went to the stone age...
and if you guys have old powermac g5 dualcore sitting around because you got a new mac pro. i'll help you dispose of it no problem. i'll even do it for free. ;)
emotion
Jul 20, 02:31 PM
I'm not sure either and I shouldn't have made the assumption. I know Ableton and Cubase do as I've used both and I'm now an avid Ableton user. I'd imagine Logic will take full advantage sometime soon since it's now one of Apple's pro applications. It certainly makes sense considering how bogged down your system gets once you load enough virtual instruments and effects.
I'm a Live user too. I wouldn't assume the forthcoming Live 6 supports more than two cores though.
I agree about Logic and the multi core support. They should have done this for the G5 quads though (I hear the quad owners scream :) ).
Edit: apparently Live 6 supports more than two cores/procs
bigmc6000
Aug 11, 05:16 PM
:confused: patent intrusion in europe??? Are you serious? Do you have any examples to verify your claims where a european company violated US patent law and this wasn't enforced by the european judicial system?
Go buy, oh say, Clerks II (or some other movie that just came out) on DVD. It's a hell of a lot easier to find it in Europe than it is here (obviously assuming you don't already know where to get it)...
And seriously what's the EU court going to do? "We'll fine you", "No really we're not kidding", "Ok, we fine you!", "Oh, you want an appeal, ok. We won't fine you yet"
(Has MS ever paid a dime of the millions of dollars they've been "fined"? Note: I'm not saying the US system is any better, but the EU certainly isn't.)
The main point is that, as people have continually pointed out, the wireless technology available in Europe is the same as what's being used in India and China. AKA - the reverse-engineers in China just love to get ahold of stuff that works with what they've got...
NY Guitarist
Apr 6, 11:54 AM
What is the obsession with back-lit keys?
Do you actually look at the keyboard when you're typing?
Yes. I need to see the keyboard. And in a dark room it's critical.
mactoday
Apr 6, 10:55 AM
Since you have no clue how the sandy bridge airs will perform, I'll take your statement as FUD.
Actually the 320M performs better than the Intel HD 3000, so the dude is right that the graphics chip in SB is slower.
NATO
Nov 28, 06:18 PM
I think they're a long way off getting money from every iPod sold. For a start, it's such an illogical thing to ask for (did the music companies ask for money for every CD player or tape recorder sold? Nope). Plus I suspect the main reason Microsoft agreed to pay money in the first place is that they needed to get the music labels on board to boost the Zune Music Store; Microsoft was in the weaker position here, and I believe the labels exploited that weakness.
If the labels were to go to Apple and demand a royalty on every iPod, threatening to pull their catalogue if they didn't get it, they would actually come off worse than Apple in terms of lost revenue, and it's because of this I reckon they haven't a chance...
rdowns
Apr 28, 04:06 PM
Because there was never a question of whether or not any of those men were born in the US; with Obama the past was always a bit hazy as to whether he was actually born in Hawaii or that's just what his parents told him. Obviously he doesn't remember BEING BORN in Hawaii... his parents could have just told him that.
But now we have proof and it's all over with; there's no need to be calling names about it.
BS, we already had proof from 2008.
Glen Quagmire
Aug 23, 03:32 PM
This will likely suck, because the interconnect Intel is using is just too damn slow. Putting four cores in the same package will just make the situation worse, because a lot of applications are significantly limited by memory performance.
The Woodcrest processors have been put through their paces pretty well on the supercomputing lists, and their Achilles' heel is the memory subsystem. Current generation AMD Opterons still clearly outscale Woodcrest in real-world memory bandwidth with only two cores. Unless Intel pulls a rabbit out of their hat with their memory architecture issues when the quad core is released, AMD's quad core is going to embarrass them because of the memory bottleneck. And AMD is already starting to work on upgrading their already markedly superior memory architecture.
In two years' time, Intel will release Nehalem, its next micro-architecture, to replace Merom/Conroe/Woodcrest. It is supposed to ditch the FSB in favour of Intel's own interconnect, named CSI. Two years after Nehalem will come another micro-architecture.
In some respects, I'm quite happy to have ordered a Woodcrest Mac Pro, especially if the slow FSB does slow things down when Woodcrest's successor is released. If the Mac Pro can last me three or four years, I'll be in time for the post-Nehalem generation, which should be fairly spectacular.
LaDirection
Jul 14, 04:36 PM
"Steve Jobs really must have been embarrassed after claiming we'd have 3 GHz when we still can't even pass 2.7 GHz without a huge unstable liquid cooling system."
I think we'll see more cores per cpu before we see 3GHz. IMHO, 4,8 or more cores at 2.66 is far better than 1 or 2 cores at 3GHz.
"Steve Jobs really must have been embarrassed after claiming we'd have 3 GHz when we still can't even pass 2.7 GHz without a huge unstable liquid cooling system."
IBM never produced chips that could run at 2.7GHz. IBM was stuck at 2.2GHz instead of the promised 3GHz, and Apple requested that those chips be overclocked to 2.5GHz. Later, IBM was stuck at 2.3GHz, and those chips were likewise overclocked to 2.7GHz. This year we are at dual cores at 2.5GHz. Even if Apple uses nothing but 2.66GHz dual cores, they will still be the fastest, non-overclocked chips that Apple has ever used.
"IMHO, 4,8 or more cores at 2.66 is far better than 1 or 2 cores at 3GHz."
8 cores?! Wow, maybe one day! But 2 or more cores/CPUs are only good if your app can use them. Most applications, and in fact many of Apple's, do not use more than 2 cores/CPUs. The quad-core G5s are a good example: the 3rd and 4th cores are unused 98% of the time. A dual 3GHz would be much more useful to a user than an 8-core 2.5GHz!
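The trade-off above is essentially Amdahl's law: extra cores only pay off to the extent the workload is parallel. A quick sketch (the clock speeds match the post; the parallel fraction is an illustrative assumption):

```python
def effective_speed(clock_ghz: float, cores: int, parallel_fraction: float) -> float:
    """Throughput relative to a 1GHz single core, per Amdahl's law:
    the serial part runs on one core, the rest is split across all cores."""
    serial = 1.0 - parallel_fraction
    return clock_ghz / (serial + parallel_fraction / cores)

# If an app is only 30% parallel, the faster dual beats the 8-core:
dual = effective_speed(3.0, 2, 0.3)   # 3.0 / 0.85   ~= 3.53
octo = effective_speed(2.5, 8, 0.3)   # 2.5 / 0.7375 ~= 3.39
print(dual > octo)  # True
```

With a highly parallel workload the conclusion flips, which is exactly why the argument hinges on how well apps of the day used multiple cores.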
P.S. The number ONE problem that Apple must address in their pro line is the lack of hard drive bays! We need at least 4 HDs, please! An internal 10,000 RPM RAID array is music to the ears of pro video and film users.
Hugh
Mar 22, 09:33 PM
The U.N. Security Council perhaps, but not the entire assembly. It would have been interesting to open that issue up to debate and seen how all the members would have voted.
What I always wonder is what diplomatic efforts were used to pressure Qaddafi? There were no (as far as I know) threats of economic embargoes, freezing of assets, or other less violent methods to coerce Qaddafi. We didn't need to convince him to step down. We simply needed to convince him to tone down: to defend himself against the armed insurrection, but not wage a wider and violent campaign against innocent civilians.
I need a clearer demonstration that serious steps were taken before resorting to war. War should be used as the last resort and only when it's clear that all other options have failed.
xxBURT0Nxx
Apr 9, 09:45 AM
I don't think 2IS is getting that IF Intel allowed Nvidia to continue making Sandy Bridge chipsets, Nvidia could've easily integrated a 320M successor into the south bridge. This would give you the best of both worlds: the downclocked low-voltage Intel HD graphics when on battery or doing basic surfing, or the 320M successor in the south bridge when playing games or doing Aperture photo editing. All this WITHOUT raising the motherboard chip count that a separate discrete GPU (on its own, not integrated into the chipset like the 320M) would entail.
I thought the 320m was also integrated? Wouldn't that mean that would be your only graphics card were nvidia allowed to add them to sandy bridge? I don't see why you would have integrated intel hd 3000 along with an integrated 320m (or successor).
lyngo
Apr 7, 10:19 PM
Wow. This is pretty huge.
cyberbeats
Jul 21, 07:11 AM
hi,
i've just sold my dual g5 because i plan to buy a new mac pro in august.
but it seems it will already be obsolete after 3 months.
can you please tell me if the socket of woodcrest
will make the mac pro upgradable one day,
or do these new types of processors need a different socket?
Thanks.
Reach9
Apr 11, 01:33 PM
The iPhone 4 is still the best smartphone in the market, so not surprising.
As for people expecting a 4" screen on the next iPhone: dream on. They are not going to make an iPhone with a bigger screen.
You're kidding right? iPhone 4 and iOS 4 are incredibly stale. Apple has realized this and hence strong rumors suggest a total revamped iOS 5. Anyway i don't agree with you, i don't think the iPhone 4 is the best smartphone in the market.
What is the best smartphone in the market? The major Android phones (Thunderbolt, EVO etc.)
I wouldn't put that much thought into the OP, guys. No way Apple would not take advantage of the holiday season. Do you think people will actually buy the over-a-year-old iPhone 4?
Remember how many sources said that the iPad 2 wouldn't be released until September? Remember how many people said there won't be an iPhone 4, until Gizmodo leaked the 'prototype'?
We'll see about the iPhone 5 in WWDC.
If anything Apple could have kept their iPad 2 for a September launch, but Apple is actually losing big time in the smartphone market, imo.
If i don't see an iPhone 5 in WWDC, then i'll consider jumping ship.
Apple has never been one to react to competition in the recent years. They seem to do what they think is best and let others follow them.
I think they know that if they bring out the best one when it is released, they will sell as many as they can make for a long time.
Of course Apple reacts to competition, every company in a market economy does. Apple might not blatantly say "the competition has a faster processor, that's why we made the A4 chip" but a basic University Econ class will teach you that every company reacts to the competition. Apple is no different.
Even if they do what they think is best, then they're greatly failing.
As a smartphone it is the iPhone that is following the competition, such as the lack of a notification system.
AndroidfoLife
Apr 6, 04:42 PM
Upper Middle Class FTW!
Poor college student for the win.
I have to be a part-time street pharmacist to pay for my tech addictions
shelterpaw
Aug 11, 04:04 PM
No, not EVERYONE. I own 4 cell phones. By your logic, I would be counted as 4 people.
And you have all the personalities to go with them. :D
Jimmieboy
Aug 8, 01:45 AM
Yahoo! Leopard looks awesome! Time machine looks like a lifesaver for me and spaces makes life so much easier. THANKS STEVE!
milo
Sep 13, 07:05 AM
A bit pointless given that no software utilises the extra cores yet.
Not true, according to the article. They said it wasn't easy, but they were able to max out all 8 cores. You can see the Activity Monitor graph all filled up.
It would be nice if 10.5 would allow a more 'blind' method to utilize these cores, versus having programmers specifically program for multi-core. Now that would be extremely helpful and allow a more simultaneous workflow.
That's how it is now, at least with multiple apps. I bet it's possible to program for an unspecified number of multiple cores, and there may be apps doing it already.
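One way to "program for an unspecified number of cores," as suggested above, is to query the core count at runtime and size a worker pool to it. A minimal sketch in Python (a language choice of mine, not something from the thread):

```python
# Size a worker pool to however many cores the machine reports,
# so the same code scales from 2 cores to 8 or more.
from multiprocessing import Pool, cpu_count

def square(n: int) -> int:
    return n * n

if __name__ == "__main__":
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(square, range(10))  # work split across all cores
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```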
I was interested to see that they were unable to max out CPU utilization on all 8 cores in the system. I hope it's due to the software these days not being ready to fully utilize more than one or two cores and not due to OSX's ability to scale to larger core counts. Since that's obviously where we're heading. Does anyone know about the potential for scalability of OSX to large numbers of CPU's/cores? I know some *nix varieties and BSD varieties do this really well, but one wonders if they were thinking this far in the future when they developed OSX. It'll be interesting to see...
Read the article again, they WERE able to max them out, just not easily. Based on that, OSX seems to be able to scale already. Developers just need to start writing apps that are more MP friendly.
wmmk
Aug 17, 09:37 AM
Won't Adobe use Core Image when the Universal Binaries come out? If both Quads had the same high powered graphics card, the benchmarks may show them to be the same with Core Image tasks.
doubt it. because having core image would mean a totally separate windows version. developing 2 totally different codebases would take forever.
W. Ademczyk
Aug 27, 09:41 PM
IMO, I believe the new enclosure will basically add easier access to swappable HDD's like the MB. It doesn't seem appropriate for a lower end model computer to have a feature the professional level model should have. That's why you pay the big $. I think the enclosure will remain the same, but we'll see an update that will allow users to change out their hard drives if they choose.
Exactly, allowing the user to swap out components is definitely a direction that Apple is taking, which is something that helps them stay competitive in the PC world. The Macbook, as we all know, utilizes a design that makes it easy to swap out RAM and HDDs; and the Mac Pro is configured with snazzy slide-out trays so that virtually every piece of hardware can be swapped out easily. This is a feature that the new MBP case design had better incorporate.
In regard to the iPod incentive, if Intel shipped Merom to manufacturers at the end of July, will announce its release to the public on the 28th, and Apple's own shipment of Merom-toting computers comes in on the 5th, I have a hard time understanding why they would wait 2-3 weeks to put these computers in the hands of the public when Dell, HP, and Lenovo will be updating their websites the second that the announcement is made. As far as I can tell, there were two reasons Apple started giving away free Nanos to college kids. First, they needed to clean out the inventory for the next iPod line; and second, the back-to-school rush is the best time to increase market share, since college students probably make up the highest percentage of Windows-to-Mac switchers. Since Merom reportedly costs Apple the same amount as Yonah, and MBP sales have been a little lackluster, it would make next to no sense for Apple to drop the iPod rebate. We have to remember that the only reason Macintels were released with Yonah in the first place is that Apple wasn't able to pressure Intel into giving them Merom early (thus explaining the drop from 64-bit processing to 32-bit and then back up again 7 months later). If Apple hadn't released the Intel line when it did, they would have been stuck with a stale product line and, missing out on the back-to-school rush, wouldn't be enjoying their doubled market share.
I think it's fair to conclude that the 16th was chosen as the end date for the Nano rebate not because Merom will appear after that time, but because most back-to-school shopping will be done by then. It is in Apple's best interest to try to catch the tail end of the college shopping season with the MBP.
Endow
Aug 27, 03:35 PM
Can someone tell me what Santa Rosa is all about and how much of a difference it makes (as far as Merom is concerned)??:)