vincenz
Apr 15, 11:05 AM
Personally, I think it's great. However, they should be careful. Moves like this have the potential to alienate customers. That said, props to the employees.
Alienate? How so?
I like the name of the project. It's very optimistic.
ShnikeJSB
Oct 26, 05:16 PM
My question is: if desktops are ramping up their cores so quickly with quad-core and dual quad-core processors, why are we stuck at "only" dual-core for notebooks for so long? As far as I can tell from my own "research", notebooks will be stuck at dual-core until at least Nehalem (45nm - 2009), and more likely Gesher (32nm - 2011), but certainly not Penryn (45nm - 2007). What gives??? Hell, at around the same time that Gesher arrives, Intel's Kiefer is supposed to be 32 cores!
I know, heat and power, blah blah blah. But are laptops really going to be left THAT far behind?
ABernardoJr
Apr 20, 09:37 PM
Is that a prerequisite? I have an Apple battery charger.
lol It is not a prerequisite, but it might become a bit problematic when assumptions like these are made:
I don't. I just don't have OS/X. I just assumed that OS/X might not have it since some OS/X users here were confused about Windows hiding system files. :)
I'm not saying the assumption was true or false, but making assumptions about things that could be clarified by having the product certainly makes it seem like having it might help lol
topgunn
Aug 29, 11:24 AM
Would you be more or less likely to believe this report if it was released by the EPA?
Bill McEnaney
Mar 27, 11:37 PM
Spitzer says it's very rare and FOF are misquoting him and misusing his study.
FreeState, have you read the note where I posted a link to the same video you posted, the one about what Spitzer says about Focus on the Family? I don't know why FOF neglected to mention how rarely sexual orientation changes. But I think Dobson's organization should have mentioned that rarity.
acslater017
Apr 15, 10:50 AM
I have a couple problems with this approach. There's so much attention brought to this issue of specifically gay bullying that it's hard to see this outside of the framework of identity politics.
Where are the videos and support for fat kids being bullied? Aren't they suicidal, too, or are we saying here that gays have a particular emotional defect and weakness? They're not strong enough to tough this out? Is that the image the gay community wants to promote?
Man, being a fat kid in high school. That was rough. There were a number of cool, popular gay guys in my school. I'm sure they took some crap from some people, but oh how I would have rather been one of them! But hey, I'm still here, I'm still alive.
Bullying is a universal problem that affects just about anyone with some kind of difference others choose to pick on. It seems like everyone is just ignoring all that for this hip, trendy cause.
There's nothing wrong with focusing on a particular issue. The Japan tsunami is not the only suffering going on in the world, but people raise money and raise awareness about it cuz it wouldn't make sense to rally around "fix everything".
grooveattack
Apr 13, 02:40 AM
Update: An Apple rep told LoopInsight to stay tuned for news on the rest of the suite:
"Today was just a sneak peak of Final Cut Pro, stay tuned"
Motion and colour should come soon
On FCPX
OH GOD IT LOOKS KINDA LIKE IMOVIE AND IT'S UNDER $1000! Clearly not for the pros, and now no one can edit on this
*sarcasm*
It has a tidy UI, it's fully 64-bit, it's gonna use all 8 of my cores, and it can still do exactly what the current FCP can do, just more easily.
Looking forward to it.
I think they will still have the full Studio boxed in store; I don't fancy downloading 6 DVDs' worth of FCS from the App Store, although it would make updates very easy.
"Today was just a sneak peak of Final Cut Pro, stay tuned"
Motion and colour should come soon
On FCPX
OH GOD IT LOOKS KINDA LIKE IMOVIE AND IT'S UNDER $1000! clearly not for the pros and now no one can edit on this
*sarcasm*
It has a tidy ui, fully 64bit, it's ganna use all 8 of my cores, can still do exactly what current FCP can do just easier.
Looking forward to it.
I think they will still have the full studio boxed in store, I don't fancy downloading 6 DVDs worth of FCS from the app store, although it would make updates very easy.
takao
Mar 14, 12:21 PM
At the risk of bumping this up to PRSI, let me just say that I thought 'saving face' was a thing of the past.
in Japan, though, it's a little bit different. that's also why there isn't much open panic: simply because the majority of Japanese don't want to be seen 'losing it'
off-topic side note: with other nuclear plant designs these events could have been massively more dramatic: for certain Swiss/German/European power plants, if one reactor's cooling fails, the emergency generators are actually to be powered by the _other_ nuclear reactors on site ...
leaving the nuclear situation discussion aside for now: interestingly, even a town which actually had a very expensive tsunami protection wall was hit, since the wall simply wasn't anywhere near high enough
the most important point now will be to get the infrastructure running again, because those fuel/electricity/food shortages are turning out to be really problematic
ezekielrage_99
Sep 26, 12:34 AM
Until they get the 45nm process up and going, I think this is going to be the top of the line. 4 cores topping out around the mid 2GHz range.
I wonder if this is Intel's long term strategy-- keep the cores relatively untouched, but double the number with each process step. That'll be entertaining for a generation or so, but they're going to have to come up with something else.
Sounds like both Intel and AMD are going by the philosophy of "more cores, more speed."
It looks like the programmers will be in for a fun old time.
ffakr
Oct 6, 12:00 AM
I must love punishment because I scanned this whole thread. We need some sort of system to gather the correct info into one location. :-)
Multimedia, you're so far out of mainstream that your comments make no sense to all but .01 % of computer users.
Seriously.. most people don't rip 4 videos to H.264 while they are creating 4 disk images and browsing the web.
I work at a wealthy research university; I set up a new Mac every week (and too many PCs). A 1st-gen dual 2.0 G5 is plenty fast for nearly all users. I'm still surprised how nicely ours runs considering it's 3 years old. In my experience the dual-cores are more responsive (UI latency), but a slightly faster dual-proc will run intensive tasks faster.
The reality is, a dual-core system.. any current dual-core system.. is a fantastic machine for 95% of computer users. The Core 2 Duo (Merom) iMacs are extremely fast. The 24" iMac with 2GB RAM runs nearly everything instantaneously.
The dual dual-core systems are ridiculously fast. I've set up several 2.66GHz models and I had to invent tasks to slow the thing down. Ripping DVD to H.264 does take some time with HandBrake (half the playback time, i.e. ripping 1 hour of DVD in 30 minutes), but the machine is still very responsive while you're doing that, installing software, and having Mathematica calculate Pi to 100,000 places. During normal use (Office, web, mail, chats...) it's unusual to see any of the CPU cores bump up past 20%.
I'm sure Apple will have 4 core cpus eventually but I don't expect it will happen immediately. Maybe they'll have one top end version but it'd certainly be a mistake to move the line to all quad cores.
Here's the reality...
- fewer cores running faster will be much better for most people
- there are relatively few tasks that really lend themselves well to massive parallelization. Video and image editing are obvious because there are a number of ways to slice jobs up (render multiple frames.. break images into sections, modify in parallel, reassemble...).
- though multimedia is a core Apple market.. not everyone runs a full video shop or rendering farm off of one desktop computer. Seriously guys, we don't.
- Games are especially difficult to thread for SMP systems. Even games that do support SMP, like Quake and UT, do it fairly poorly. UT only splits off audio work onto the 2nd CPU. The real-time nature of games means you can't have 7 or 8 independent threads on an 8-core system without running into issues where the game hangs up on a lagging thread. They simply work better in a more serial paradigm.
- The first quad-core chips will be much hotter than current Core 2 chips. Most people.. even people who want the power of towers.. don't want a desktop machine that actually pulls 600W from the wall because of the two 120-130W CPUs inside. Also, goodbye silent Mac Pros in this config.
- The systems will be far too I/O bound in an 8-core configuration. The memory system does have lots of bandwidth, but the benchmarks indicate it will be bus and memory constrained. It'll certainly be hard to feed data from the SATA drives unless you've got gobs of memory and you're not working on large streams of data (like video).
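A toy sketch of the "break images into sections" idea from the list above: carve a pixel buffer into strips, transform each strip on its own worker, then reassemble in order. The function names and the brighten step are purely illustrative, not from any real editor; threads keep the sketch simple, while a real editor would hand strips to processes or native threads to actually span multiple cores.

```python
from concurrent.futures import ThreadPoolExecutor

def brighten(strip):
    # stand-in for a real per-pixel edit
    return [min(255, px + 40) for px in strip]

def parallel_edit(pixels, workers=4):
    # carve the flat pixel buffer into roughly equal strips, one per worker
    size = max(1, (len(pixels) + workers - 1) // workers)
    strips = [pixels[i:i + size] for i in range(0, len(pixels), size)]
    # modify the strips in parallel, then reassemble in original order
    # (Executor.map preserves input order, so reassembly is just a flatten)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        edited = pool.map(brighten, strips)
    return [px for strip in edited for px in strip]

print(parallel_edit([0, 100, 200, 250], workers=2))  # [40, 140, 240, 255]
```

This only works so cleanly because each strip is independent; edits that need neighboring pixels (blurs, etc.) have to overlap the strips at their seams, which is part of why relatively few tasks parallelize this easily.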
http://www.tomshardware.com/2006/09/10/four_cores_on_the_rampage/
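The "lagging thread" point in the games bullet above can be sketched with some toy arithmetic: a real-time frame can only be presented once every worker thread for that frame has finished, so frame rate is set by the slowest worker, not the average. The timings below are made up for illustration.

```python
# Toy model: a frame that fans work out to independent workers is done
# only when the slowest worker finishes, so one laggard caps the FPS.

def frame_time(worker_times):
    # the frame can't present until every worker has finished
    return max(worker_times)

def fps(worker_times):
    return 1.0 / frame_time(worker_times)

balanced = [0.004] * 8              # eight workers, ~4 ms each
lagging  = [0.004] * 7 + [0.025]    # same, but one worker takes 25 ms

print(round(fps(balanced)))  # 250
print(round(fps(lagging)))   # 40
```

That single 25 ms thread drags the whole frame from ~250 fps down to 40, which is why splitting a game into many independent threads buys less than the core count suggests.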
Finally, Apple is all about perception. Apple has held back CPU releases because they wouldn't let a lower-end CPU clock higher than a higher-end chip. They did it with the PPC 603 & 604, and I think they did it with the G3 & G4.
It's against everything Apple's ever done to have 3.0GHz dual dual-core towers in the mid range and 2.33GHz quad-core CPUs at the high end.
I see some options here..
Maybe we'll get the dual 2.66 quad-cores in one high-end system. The price will go up.
Alternately.. this could finally be the rumored Mac Station.. or.. Apple has yet to announce a cluster node version of the Intel Xserve.
Geez.. almost forgot.
For most people... the Core 2 desktop systems bench better than the 4-core systems or even the dual Core 2 Xeon systems, because DDR2 is lower latency than FB-DIMMs. To all the gamers: you don't want slower-clocked quad-core chips.. not even on the desktop. You want a speed bump of the Core 2 Duo.
Howdr
Mar 18, 08:35 AM
OMG, you still don't get it:
Let's try explaining it this way...
When you subscribe to cable, you pick a package that provides you with the channels that you want. There are various packages, but ultimately it's all just video streaming over a cable (bits in this day and age, not analog)...
Based on your and others' arguments, why can't we all just pay for basic cable and get all 500+ channels plus the premium channels for free? Very simply, you're paying for a package with specific features....
No, no; as long as you stay within the amount of data in the plan, it should not matter how you use it.
You can't steal what you paid for; you buy 100 cable channels, and that is what you get and use.
You buy 2GB and use 1GB, you have used 1GB, no matter if it's on the phone or the laptop. 1GB = 1GB.
With your cellular service, you chose a package that meets your needs. You have 3 options for data plans at this point, well, 4 technically...
1) Your grandfathered unlimited plan
2) 250mb
3) Data Pro 2GB
4) Data Pro 2GB + Tethering 2GB for a total of 4GB....
Ok? The tethering gives you 2GB for the money, I see that, and I have read that the tethering and Data Pro allowances are added to total 4GB for the charge. So you and AT&T prove my point, thank you! Data = Data; they add it together and it is the same.
Tethering is not the same as using the data on your device; essentially, tethering is using your phone as a modem. Your data plan (which I'm assuming is either unlimited or 250mb) does not include the feature of using your phone as a modem, and that's what the extra charge is for....
If you want to tether, you need to pay for the appropriate package. Just like if you want HBO, Showtime, or HDTV you need to pay for the appropriate cable package...
LOL no, it's the same use of data as on the phone.
Tethering does not do anything different to AT&T; it's just using data.
You may not understand how data is used from the source, but I assure you there is no difference to AT&T between when you tether and when you surf YouTube on the phone.
To AT&T, Data = Data, and those have been their words, not mine, every time it's printed by them.
So far I have not seen an argument that proves otherwise.:rolleyes:
manic
Jul 12, 09:05 AM
Okay, people are hyped about the 4-core Xeon. But aren't we overlooking something here? Aren't server processors designed to do substantially different work than desktops? What's the point in fitting a >$1000 processor into a machine that runs Photoshop, only to see it slug away? I'm not saying that's the case, but I think it's a relevant point and would like to know if anyone knows the answer. If it's slower at desktop tasks, then we will be seeing Conroes in Mac Pros. If it's faster, then there's a pretty good chance it will fit the highest-end one.
Now, unless the other chap who said "anything other than Woodcrest would be absolutely insulting" knows WC is insanely faster at desktop tasks, I think he's just building some negative hype. Conroes are supposed to outperform everything we've seen so far by a wide margin. It's by no means insulting.
jiggie2g
Jul 12, 05:15 PM
This thread is getting too funny. Apple has been so far behind on power these past few years, and now that we get the chance to use Conroe, suddenly that's not good enough for the Mac snobs. Conroe is an extremely fast chip (especially compared to the G5), so I don't get why some people think it's a bad choice for the pro lineup. Sure, it can't do SMP, but not everyone needs or wants to pay for quad processing.
So, aside from the ability to do multiprocessing, what advantages does Woodcrest have that make it mandatory for the pro line? How much "faster" is it going to be than the Conroe? It's my understanding that they are identical in that respect.
They are; you will not see any performance difference between Merom, Conroe, and Woodcrest at equal clock speeds unless you go SMP. They will all encode, render, and transcode at the same pace. The FSB means nothing, as it has yet to be saturated even at 667MHz. Tons of tests and benchmarks at XtremeSystems done over the past few months have proven this.
Making the Mac Pro line all dual will be a big mistake; it will backfire on Apple and force many people to go right back to PC. I can promise you, if you want a Woody in a Mac Pro, be prepared to pay an entry fee of $2499 to join this exclusive club of idiots.
I remember when my iMac G4 was starting to show its age, and when the time came to replace it, the minimum price for a real desktop Mac was (and still is) $1999 for a dual 2.0GHz G5. So what did I do? I said goodbye to Apple and built a better machine for half the money. To this day I have no regrets and would never go back, unless I was in the market for a notebook; then I'd get a MacBook.
I still can't believe Apple has the balls to charge $2000 for an outdated desktop that gets outperformed by an $800 PC, while still having a smaller hard drive, less RAM, fewer USB ports, and no card reader. Jobs believes you Mac loyalists are stupid.
Careful. You can get banned for calling anyone here a naughty name. They will go whining to the moderators and a moderator who might not like you in the first place will lock you out of the process. So I don't disrespect anyone in writing here any more. Everyone here is beautiful and fun to be with. :)
Believe me, bro, I've already been there. :D
NathanMuir
Apr 24, 11:49 AM
I figured I'd use this wonderful Easter Sunday (a day spent celebrating the beginning of Spring and absolutely nothing else) to pose a question that I have.... What's the deal with religious people?

After many a spirited thread about religion, I still can't wrap my head around what keeps people in the faith nowadays. I'm not talking about those people in third-world nations who have lived their entire lives under religion and know of nothing else. I'm talking about your Americans (North and South), your Europeans, the people who have access to any information they want to get (and some they don't), who should know better by now.

And yet, in thread after thread, these people still swear that their way is the only way. No matter what logic you use, they can twist the words from their holy books and change the meaning of things to, in their minds, completely back up their point of view. Is it stubbornness, the inability to admit that you were wrong about something so important for so long? Is it fear? If I admit this is BS, I go to hell? Simple ignorance? Please remember, I'm not talking about just believing in a higher power; I mean those who believe in religion: Jews, Christians, etc.
If you strike a biased and confrontational tone, you get one in return. ;)
And people wonder why PRSI conversations revolve in endless circles, rehashing the same tired subject matter...
BJNY
Oct 4, 02:55 PM
Does anyone know how much power a Clovertown 2.33GHz will draw compared to the current Woodcrest 3GHz? I hope Apple's power supply is adequate for a Clovertown, 4 SATA hard drives, 2 optical drives, and a better PCIe graphics card.
okboy
Apr 8, 11:03 PM
Poaching endangered species is illegal.
It's a pretty clear sign that they will be getting into gaming in some way.
Sorry, did you miss the iPhone, iPod touch, and iPad? Wake up!
http://www.forbes.com/2008/06/04/apple-nintendo-iphone-tech-wire-cx_bc_0605nintendo.html
solidus12
Dec 30, 07:18 AM
I think the realistic expectation is: "If Apple doesn't make any more changes to the iPhone for the next 10 years, there will be an Android phone to beat it by 2020!!"
I feel like the trend is going to stay the same as it was with the G1. They're like "ooo look at our neat new features!!" Unfortunately, the iPhone/iPod just got those features, only better, just before you launched.
The competition just can't stay ahead, and Apple is going to keep it that way.
Yeah, I mean, what with the iPhone's Bluetooth transfers, tethering, awesome camera, Flash support, excellent reception, fantastic battery life, etc...
Yeah way ahead.
No.
The iPhone is successful because of the user experience; it's one a child can pick up and use, it is a slick and fluid experience, and it's packaged in something attractive.
People see it and are drawn to it because of this; the other phones require time and effort navigating between menus and options to figure out how to use them. The iPhone is simple: pick up and play.
It has pushed the boundaries on user experience and how a phone should work. Yes, and that has been a very attractive feature, because it does everything all other phones can do but presents it far better.
snoopy07
May 5, 02:22 PM
The perfect solution would be for Apple to give all US carriers the iPhone. Then we can go and pick the network that works best. People that like AT&T stay with AT&T; if you want Verizon or T-Mobile, then go. That way we all live happy. It's your call, Apple :apple:. We customers deserve to choose the carrier for our iPhone.
benpatient
May 2, 09:18 AM
As I understand it, Safari will open the zip file since it's a "safe" download. But that doesn't mean it'll execute the code within that zip file, so how is this malware executing without user permission?
Malware doesn't execute without user permission.
It relies on tricking the user into giving it permission to run, striking at what is typically the weakest link in any computer's security: the user.
Any argument that XX isn't a threat because it requires users to take an action in order to be truly dangerous is a flawed argument, because in general users are stupid, or at the very least careless.
MacQuest
Jul 12, 09:29 AM
Spooky - I predicted this. Me and everyone else except a couple naysayers. I only buy laptops though, so I'm not really the target market. But I think this will be on every graphic designers desk by Xmas. Go Apple and Intel!
Yup, I agree. Companies need to use up their annual budget by Q4, so they're just lookin' for things to buy at that time. I saw it all the time at Xerox. The account reps would scrape and scrounge for sales for the first 9 months, start getting easier sales in October and November [since it's Q4], and then they would just sit back and wait for sales to come to them from customers that [i]had[/i] to buy things before the end of the year and spend their remaining allocated budget, otherwise their budget would get cut for the following year.
Maybe for Easter we'll get Adobe CS3 in a colorful egg or frilly basket. :rolleyes:
Adobe blows.:mad:
;)
RebootD
Apr 12, 11:38 PM
Wirelessly posted (Mozilla/5.0 (iPhone; U; CPU iPhone OS 3_0 like Mac OS X; en-us) AppleWebKit/528.18 (KHTML, like Gecko) Version/4.0 Mobile/7A341 Safari/528.16)
As a print designer who has slowly started moving into editing and animation it made sense for me to just pay more for the Master Collection and start using Premiere and AE.
That said I miss using FCP (I used it at a job a few years back) and at $299 I am happy to pick it up and combine it with AE.
citizenzen
Mar 15, 11:24 PM
Have I defined "contain" to your satisfaction?
Not really.
Here. I'll provide an example of equally insightful commentary ...
One day, this will all be over.
edifyingGerbil
Apr 24, 12:09 PM
Great, let's have a race to the bottom to see which faith is the more bigoted.
If you're being burnt at the stake, it doesn't make much difference whether that's because of a story someone made up 2000 years ago, or a story a priest made up today. Faith is still the excuse, and the result is the same.
I'm not trying to further some Christian agenda or proselytise. I'm saying these things because I would rather support Christianity/Judaism/Atheism/whatever than Islam.
These days you'd be hard-pressed to find someone being charged with blasphemy in a Western democracy, but it's an almost everyday occurrence in the Muslim world. The only time it happens in the West is when someone insults Islam; then it's classed as hate speech.
JasperJanssen
Apr 30, 02:52 AM
Surprise. The major enterprise players take the top three spots.
Since when is Acer an enterprise player and Lenovo not?