flopticalcube
Apr 24, 12:40 PM
There are hells (known as "Naraka") in Hinduism and Buddhism too, but none of them are eternal, and all of them are only for people who have done really bad things in life, regardless of faith or lack thereof.
Christian believers who are enslaved by their fear of hell, as opposed to having their faith based on genuine love of God, will allegedly end up in hell anyway.
I was always under the impression that reincarnation was considered a kind of living hell, like reliving Junior High School over and over again.
The fire and brimstone of hell certainly figures in a lot of the fundamentalist sects of Christianity, and many of the Protestant ones too. My father-in-law is a Presbyterian lay preacher and constantly prattles on about it.
rasmasyean
Mar 11, 10:17 PM
Wikipedia seems to be kept up to date. If you have something new, maybe you guys can add it to this...if someone didn't beat you to it. ;)
http://en.wikipedia.org/wiki/2011_Sendai_earthquake_and_tsunami
Multimedia
Sep 26, 10:43 AM
http://www.anandtech.com/storage/showdoc.aspx?i=2480
I know they're making a PCI Express, DDR2, SATA II version though. Old news to me... Thanks, but that looks like it's only for PCs. Do you know if it works in Mac G5 Quads and Mac Pros?
I went to the GIGA-BYTE TECHNOLOGY CO. website, and it looks like they don't even make that i-RAM card any more. The article linked above is from July 25, 2005, more than a year ago.
nagromme
Oct 7, 02:12 PM
I think point 3 is the biggest problem with the iPhone OS and will be what, in the long run, lets others overtake it.
Valid points, except you're looking at a micro-niche of power-users, while the iPhone's massive growth comes from a much broader market than that. Android will (and does) take some power-user market share, and I look forward to seeing where it goes.
The big thing though is DEVELOPER share. Apps. Android will run--in different flavors--on a number of different phones, offering choice in screen size, features, hard vs. virtual keys, etc. That sounds great--but will the same APP run on all those flavors? No. The app market will be fragmented among incompatible models. There's no good way out of that--it's one advantage Apple's model will hang on to.
ubersoldat
Jun 5, 05:08 PM
Not sure this is a good test...
I'm beginning to see that while AT&T is the bigger culprit, the iPhone itself may play a role in what happens with dropped calls...
My service (as is well documented in these forums) at home was/is terrible.
I recently purchased the MicroCell from AT&T, and I can now make calls in my house!! Except, when I move just 20 feet away from the MicroCell into my kitchen, my iPhone struggles with itself to pick up the distant two-bar tower that was the guilty party in dropping my calls... so now, in my house, the iPhone juggles between a five-bar MicroCell and a one-to-two-bar tower (which still drops calls). It also drops every call I'm on if I leave my house during a call, or arrive at my house during a call.
It's absolutely ridiculous that you have to buy a MicroCell (AT&T should provide you one free of charge) to get five bars. The technology is there: here in Germany we have five bars (2G and 3G) without issues, even in buildings with tons of reinforced concrete...
kultschar
Apr 9, 05:00 AM
I've not been impressed with the control systems for certain games on iOS; however, Dead Space on the iPad 2 impresses me graphics-wise and is a step in the right direction control-wise, though still a little clunky.
Surely it's only a matter of time before we start playing apps on our Apple TVs with a special controller of some sort!
r1ch4rd
Apr 22, 09:57 PM
And if over two thousand years from now people still believe in the Higgs Boson despite no evidence that it exists I'd likely be skeptical of their beliefs as well.
Hopefully we will find the answer soon enough, because there are scientists working on both sides to prove and disprove the Higgs boson, and once it's agreed one way or the other, we won't have many scientists preaching that you should have blind faith alone. The Higgs boson is not going to be testing our loyalty!
The key thing for me that gives science credibility over religion is the ability to go back and revise your "beliefs" based on more recent findings or new understanding.
prady16
Aug 29, 10:55 AM
I sure do appreciate Dell for this!
If only making better-quality products were their first priority!
jegbook
Apr 12, 03:30 PM
What if I just want my top 10 favorites? In Windows I just drag the icon (of whatever I want) to the Start button, then drop it into the list of my favorites (I'm not sure of the actual term for this). Can this be done on a Mac?
Since I open the same 10 or 12 programs or folders or files many times throughout the day, every day, this is pretty important to me. It would absolutely mess up my work flow to lose this feature.
If this already got covered, I apologize.
Sounds like a job for the Dock. The default mode of the Windows 7 Taskbar is very Dock-like. They both generally seem like a handy place to keep your most commonly used applications.
(In Win 7, you Pin to the Taskbar with the default behavior, which turns the whole Taskbar into a Quick Launch area. Though it is possible to revert to XP-like behavior, with a Quick Launch area and worded application references to the right of it.)
I don't use the right side of the Dock in anything but "Folder" and "List" view. I still miss how Tiger (OS 10.4.x) treated Aliases (shortcuts) of folders: you could see the actual contents of the folder you aliased. Since Leopard, it just allows you to open the folder in a new Finder window. Poo. I created folders with aliases to all of my applications as I've categorized them for years.
(For the record, aliases and shortcuts are similar, but not the same. Worth googling to confirm the subtle differences.)
Strict keyboard navigation is tougher. If you like it, be sure to turn on Full Keyboard access for All Controls in the Keyboard Shortcuts section of the Keyboard Preference Pane.
I miss the split window of Windows Explorer: Folder List on the left, contents on the right. I use Column View most of the time for Finder Windows (Command-3) and sometimes List View (Command-2) if I'm specifically interested in file/folder details. I don't think there are any third party navigation tools that replicate that, either.
If you're getting a laptop, the trackpad is awesome. Nothing like it in Windows that I'm aware of.
I think Control Panels are easier and more straightforward in OS X, where they're called System Preferences, with Preference Panes. I think Control Panels got even more convoluted going from XP to Vista/Win7. That said, Windows gives much more granularity of control than OS X, but many things can be modified with some third-party help (you HAVE to check out TinkerTool).
Is it worth it? Hard to say. If you spend most of your computing in an office with Windows computers in a Windows domain? I say not worth switching. You *can* do everything, but I find it often a little more time consuming than I find it in Windows.
If most of your computing is for personal use and/or you're not integrating into a Windows domain environment? Then I'd say whatever software you need to run and personal preference can drive the decision.
Good luck!
CaoCao
Mar 26, 01:32 AM
I'm sorry, but did you really just say that relationships built on love are not stable? REALLY? Because I was always told that love conquers all. And I do believe that, because it does.
Love in its purest form is what makes humans great. You don't even know what that word means.
Love conquers all until it hits a rough patch
au revoir
OptyCT
Apr 20, 06:48 PM
Please explain to me how I am experiencing a "degraded" experience on my current Android phone?
I can do everything your iPhone can, plus tether at no additional cost and download any song I want for free.
Ease of use in Android is just as simple as an iPhone, with the ability to customize IF YOU SO PLEASE.
So if you would, cut the degraded experience crap.
I'm an avid Mac and iPad user, but I also own and use a Droid Incredible. A couple of months ago, I just about had it with the phone. Battery life was poor, frequent reboots, etc. So, I decided to root the phone. After rooting, it was an entirely new experience. All of my issues with the Incredible were resolved. Battery life was much improved, the UI was a lot smoother and well thought out, etc. However, the constant annoyance with Android was still there... the Android Market. The quality of apps on the Android Market, when compared to the App Store, is very low. It reminds me of the App Store from four years ago. On top of that, I'm paranoid to download any app that isn't made by a well-known developer.
In response to the previous post that touted the ability to tether and download music at no cost on a rooted Android: my CyanogenMod Incredible can also do this. However, you'd have to be a fool to think that the wireless carriers are going to allow this to continue. There are already warnings from top root developers that the carriers are going to lock this down in the near future.
archipellago
May 2, 04:34 PM
All successful malware includes privilege escalation via exploitation. This does not. That is why malware never has become successful in OS X and is becoming less successful in Windows. The big issue with Windows in the past was the default account in Windows XP (admin) runs with elevated privileges by default so privilege escalation was not required for system level access.
Man-in-the-browser is now the biggest issue for all OSes, malware-wise.
All the valuable information passes through the browser; there is no point attacking anything else.
jiggie2g
Jul 12, 04:34 PM
In a word: NO. There is nothing complicated about understanding Intel's processor line, only lazy consumers unwilling to read anything.
Yes, Multimedia, these are the same morons with too much money and too little sense. These are the same people who are saying, "Ohhh, why can't Conroe go into an iMac?", "But I want a Woodcrest!", "Hey, I don't care if Merom is pin-compatible, can't they go with Conroe for its better performance?" ...lol
What a bunch of whiny daddy's boys, no sense at all; they just obey the almighty Stevie Jobs when he lies about how the new Mac Pro is THE FASTEST PEECEE IN THE WORRRRLD:p
jefhatfield
Oct 8, 12:12 PM
Originally posted by Backtothemac
These tests that this guy puts up are crap! The Athlon is overclocked to be a 2100+, and none of the systems have the most current OS. I personally have seen great variations in his tests over the years, and personally, I don't buy it. Why test for single-processor functions? The Dual is a DUAL! All of the major apps are dual-aware, as is the OS!
Try that with XP Home.
i don't think there is an easy way to test a mac vs a pc for speed issues
but overall, i like barefeats and i think those tests give one a general idea of what a machine can do and are not specifically one hundred percent accurate all the time in the tests
sometimes magazine comparisons between two pc machines are not equally matched in terms of ram, video card, etc...
one thing is certain, the athlon is faster than the duron, the pentium 4 is faster than the celeron, and the G4 is faster (in photoshop) than the G3...but beyond that, it is hard to get a perfect reading
my overclocked 2 cents;)
mangrove
Sep 2, 08:21 PM
The happiest day of
Great! :) Hope you come back and let us know how the service is and how it compares to AT&T. Which phone did you get?
Since I have an iPad, that is really all I need, plus Verizon. Everywhere I would go where people had no reception (me too, with my iPhone), I would ask what carrier they used; nearly 100% said AT&T. Then in those same places I would ask the people who could talk freely on their phones what carrier they used, and something like 98 out of 100 said Verizon.
That's why I switched. Got a simple phone, the Samsung Haven: two phones for $60/month, but only 450 minutes (which I never exceeded even with the two iPhones, at around $165/month).
Sure hope the iPad is Verizon compatible soon too.
The upside to having 2 dead iPhones--now we have 2 wifi iPods so all the iPhone apps work on them.:D
javajedi
Oct 11, 08:48 AM
Originally posted by ddtlm
javajedi:
Admittedly I am getting lost in what all the numbers people have mentioned are for, but looking at these numbers you have here and assuming that they are doing the same task, you can rest assured that the G3/G4 are running far inferior software. AltiVec and SSE2 or not, there is just nothing that can explain this difference other than an unfair playing field. There is no task that a P4 can do at 11x or 12x the speed of a G4 (comparing top-end models here). The P4 possesses nothing that runs at 11x or 12x the speed. Not the clock, not the units; the bandwidth to memory and caches is not 11x or 12x as good, and it is not 11x better at branch prediction. I absolutely refuse to accept these results without very substantial backing because they contradict reality as I know it. I know a lot about the P4 and the G4, and I know a lot about programming in a fair number of different languages, even some assembly. I insist that these results do not reflect the actual performance of the processors, until irrefutable proof is presented to show how they do.
I guess the 70 and 90 don't surprise me a lot for the G3/G4, depending on the clock speed difference. But all this trendy bandwagon-esque G4-bashing is not correct just because everyone else is doing it. There are things about the G3 that are very nice, but the G4 is no slouch compared to it, and given the higher clock that its pipeline allows, the G3 really can't keep up. The G4 not only sports a better standard FPU, but it also sports better integer units.
Keep in mind this test does not reflect balanced system performance. The point of this exercise has been to determine how the G4's FPU compares to an assortment of different processors and operating systems.
I'd like to know how you qualify "inferior software" on the x86. If the P4 is somehow cheating, then all of the other processors are cheating as well. Again, we ran the exact same code. We even made it into C code on the Mac for maximum speed. In fact, I'd like for you to check the code out for yourself, so you can see there is no misdirection here. Keep in mind, other people here have run it on Athlons in Linux and still get sub-10-second times. I've also had a friend of mine (who I can trust) run it under Yellow Dog on a G4; he got 100+ seconds. And I did not tell him the scores we've been getting on the Mac; I had him run the test first and tell me how long it took before I even said anything. The JRE, and now Mac OS X, have been factored out of this equation.
When you look at operations like these, for example scalar integer ops, it's all register-bound. The FSB, BSB, or anything else doesn't matter. This is a direct comparison between the units on the G4 and everything else. Also, my question to you is: in what way are the integer and FPU units "better" in the G4? I did not build the chip, so I can't say whether they are better or not than those in the 750FX, but I can say I've run a fair benchmark comparing the FPU on the G4 against everything from a P4, Athlon, and C3 to a G3, across different operating systems: Windows and Linux on x86, and Mac OS X and Yellow Dog on the Mac. The results are consistent across the board. What more "proof" do you want?
jwdsail
Sep 20, 11:42 AM
Apple iPod Video Express... (I'm hoping to kill the 'Chicken Little' claims that the iTV name will get Apple sued)
A hard drive? Hard to believe, I'd think some flash memory as a buffer, maybe 4GB? Perhaps you can add a HD via the USB 2 port? Too small to have a 3.5" drive.. May be too small for a laptop drive.. A 1.8" drive would add too much to the cost, wouldn't it?
I think w/ the HDMI output, and the price, what we're staring at is really a wireless upscaler... Take any content from your Mac, and wirelessly upscale to the native res of your TV (up to 1080p)...
If this is the case, I may just buy one in place of the Mac mini (w/ something other than Intel Integrated *SPIT* Graphics BTO, that will more than likely never happen...) that I've wanted to add to my TV...
Shrug.
Just my $0.02US
jwd
Xtremehkr
Mar 18, 07:09 PM
I can understand why people would think this is a good thing. But at the same time, there are consequences for Apple. Apple sells other people's songs; if the labels thought that selling to Apple would mean their songs could be immediately transferred to P2P outlets, they would be reluctant to supply the iTMS with any songs.
When it comes to picking ITMS over a different outlet, the amount of available content has a lot to do with it.
I am really curious about who is behind this new group. This software is a poison pill for ITMS, as far as label owners are concerned.
Steve was just in the news for sending an email claiming that a rival's software was easily hacked in order to bypass the restrictions in place to prevent downloaders from freely sharing the music they have purchased.
This smacks of corporate skullduggery at its worst.
On a positive note, it proves that Apple is the company to go after in this area.
On the other hand, Apple needs to step up and protect its interests in this area.
For some reason, 'PlaysForSure' keeps coming to mind.
awmazz
Mar 12, 03:29 AM
Of course, as with all nuclear disasters, there's the usual "don't worry, it's not that bad" while at the same time they evacuate 45,000 people from the immediate surrounds...
Analysts say a meltdown would not necessarily lead to a major disaster because light-water reactors would not explode even if they overheated.
Well, that map seems to show Japan itself will be okay from the fallout at least.
EDIT- They've extended the evacuation radius around the #2 plant to 10km, the same as the #1 plant. The #1 plant is the one which had the explosion.
brianus
Sep 26, 02:19 PM
If what you say is true, then yes that would be IT. Why won't Tigerton go in Summer '07 Mac Pros?
This was epitaphic's explanation:
Intel has two lines of Xeon processors:
* The 5000 series is DP (dual processor, like Woodcrest, Clovertown)
* The 7000 series MP (multi processor - eg 4+ processors)
Tigerton is supposed to be an MP version of Clovertown. Meaning, you can have as many chips as the motherboard supports, and just like Clovertown it's an MCM (two processors in one package). 7000s are also about 5-10x the price of 5000s.
So unless the specs for Tigerton severely change, no point even considering it on a Mac Pro (high end xserve is plausible).
(gotta love that arbitrary terminology, huh? -- 2 processors apparently isn't "multiple").
superfula
Apr 11, 04:02 PM
seriously, stop spreading crap like this. You make it plainly obvious that you have never actually used a mac. Or, that you're a 20-something kid who values your precious soul-sucking video games above all else.
Aside from the part about installing Mac OS on the pc, which isn't THAT far off if you have the right hardware, nothing else that he said is really that inaccurate.
I'm sorry if YOU can't see any value in a mac - you aren't looking very hard. Try loading OSX on your pc. Go ahead. I'll wait. Oh, make sure it is full functionality too. I want gestures, I want full printing and network support, everything. You say you have it? Prove it. Give me screen shots, video with audio, etc.
Did you not read the thread title? The op was specifically asking for people's opinions and what they don't like. And that's exactly what he stated.
I'm sorry, but I loathe posts like yours. If you are so anti-mac, then good for you. Enjoy your world, but stay the hell out of ours.
Good grief, he didn't attack your mom. Your statement here, and really the entire post, is uncalled for. He is well within the subject of the thread. If you don't believe so, report him and move on. If you don't like his reasoning, perhaps you are far too pro-Mac to be able to know the difference. Chill.
ender land
Apr 23, 10:11 PM
I'm not sure I understand the point in the first part of your post so I'll have to skip that for now. Maybe you can phrase it a different way to help me out. Anyway, the whole "moral" issue has been raised and argued before. In my mind, there are many reasons why, logically, atheists are, by far, more moral than religious people. I'll just throw one out at you: your statement that someone who is a practicing theist has a "standard" of morals to abide by isn't something I can agree with, for many reasons. One, why does one have to have a religious book to have a standard of morals? Atheists can know right and wrong and make laws based on common sense morals. We don't need some made up god to tell us what is right and wrong. Secondly, have you read some of the "morals" in the holy books? If so, and you still follow these rules, you have very low standards for what good morals should be. One needs to look no further than the section on how to treat your slaves in the Bible to see this fact!
Ugh, so much ignorance (hopefully unintentional), I don't know where to start...
If you are theistic, clearly it would make sense to base morality off what your God believes. Not doing so would be the equivalent of an atheist not agreeing with the scientific method.
Everything you say is hinged upon the belief religions are all wrong. If this is in fact true, I suppose you having this belief is true. Though you could also debate this back and forth, IF religion is all wrong, any religious morals are therefore created by those who practiced/invented the religion, which means there are far more viewpoints having gone into the creation of such morals.
Thirdly, it doesn't even matter whether the above is true with respect to what you said. Even if religion is 100% made up, people who are religious (I'll pick on GWB again since he was by far a more practicing Christian than Obama) are still basing their beliefs on something which is written down. This makes them more trustworthy, or perhaps a better word would be predictable. It is unlikely that someone like GWB will suddenly ever go "you know what, I think you're right, it's totally ok to allow abortion," because his beliefs are based on something which will not change. On the other hand, a politician who is completely atheistic has no such 'check' or 'reference,' which means you have no idea that their position will not change.
"Common sense morals?" lol! There are so many examples of morals not being "common sense" both inside and outside theistic cultures. These "common sense" morals are only common sense because you personally believe in them, at the current time, given your set of circumstances. It is entirely possible they drastically change over time. A great example is the one you pointed out, slavery. Plenty of people thought it was "common sense" to allow slavery. What changed? Did people suddenly get "more common sense?" It seems likely to me that something like abortion is likely to eventually become a "common sense to outlaw" thing, while gay marriage will become a "wtf does the government care" common sense thing; neither of these is the current state in the United States.
Not to mention, common sense morals more or less is exactly what I am referring to when saying societal morals. The "this is morality as we see it, duh!" type of morality.
Regarding your final point, I am almost positive I have read more of the Bible and understand what it is saying better than you. I am not going to debate a book you seemingly do not know, so I will offer this: there is a difference between Old Testament law and the New Testament in terms of how we, i.e. not Jews living more than 2300 years ago, should interpret them in our daily lives. Much of the Old Testament was written to a specific group of people at a specific time (a long time ago), so even if the New Testament had not "freed" us from Old Testament law, slavery was much different at the time in practice and implementation (see Leviticus 25). Plus, if you do want to see how to treat slaves from a Biblical standpoint, in light of Christ, read the book of Philemon in the New Testament, which is specifically written to a slaveowner by Paul.
Gelfin
Apr 24, 03:03 PM
In answer to the OP's question, I have long harbored the suspicion (without any clear idea how to test it) that human beings have evolved their penchant for accepting nonsense. On the face of it, accepting that which does not correspond with reality is a very costly behavior. Animals that believe they need to sacrifice part of their food supply should be that much less likely to survive than those without that belief.
My hunch, however, is that the willingness to play along with certain kinds of nonsense games, including religion and other ritualized activities, is a social bonding mechanism in humans so deeply ingrained that it is difficult for us to step outside ourselves and recognize it for a game. One's willingness to play along with the rituals of a culture signifies that his need to be a part of the community is stronger than his need for rational justification. Consenting to accept a manufactured truth is an act of submission. It generates social cohesion and establishes shibboleths. In a way it is a constant background radiation of codependence and enablement permeating human existence.
If I go way too far out on this particular limb, I actually suspect that the ability to prioritize rational justification over social submission is a more recent development than we realize, and that this development is still competing with the old instincts for social cohesion. Perhaps this is the reason that atheists and skeptics are typically considered more objectionable than those with differing religious or supernatural beliefs. Playing the game under slightly different rules seems less dangerous than refusing to play at all.
Think of the undertones of the intuitive stereotype many people have of skeptics: many people automatically imagine a sort of bristly, unfriendly loner who isn't really happy and is always trying to make other people unhappy too. There is really no factual basis for this caricature, and yet it is almost universal. On this account, when we become adults we do not stop playing games of make-believe. Instead we just start taking our games of make-believe very seriously, and our intuitive sense is that someone who rejects our games is rejecting us. Such a person feels untrustworthy in a way we would find hard to justify.
Religions are hardly the only source of this sort of game. I suspect they are everywhere, often too subtle to notice, but religions are by far the largest, oldest, most obtrusive example.
einmusiker
Mar 18, 01:16 PM
I'd like to see some kind of evidence that they can prove people are doing unauthorized tethering. You won't be seeing it, so they really have nothing to charge you for. All we've heard so far is speculation and nothing more.