Friday, April 20, 2012

Does the iPhone 4 Really Have a "Retina Display"? (Updated)

[Updated 06/11/10, see below]

Dr. Raymond Soneira runs DisplayMate Technologies, which makes software to test display quality. He has a PhD in Theoretical Physics from Princeton University, and was a Long-Term Member of the Einstein Institute for Advanced Study in Princeton. (Read Dr. Soneira's Bio.) He also knows more about digital displays than just about anyone I know - and I know some pretty tech-savvy folks. This morning, Dr. Soneira shot me an interesting email regarding the so-called "Retina Display" of the iPhone 4. To clarify: a retina display is one whose pixel density meets or exceeds the finest detail the human retina can resolve, assuming perfect vision.

This is a bit tricky, since the eye doesn't have "pixels," and the resolution required to match the human eye depends on the distance from your eye to the display. If you sit four feet away from a 50" 1080p television, you'll see pixels. If you sit 100 feet away, you won't. Whether you can distinguish two adjacent pixels comes down to how many pixels fall within each arc degree of your field of vision. Dr. Soneira's email, in full and unedited, is as follows.

...

The iPhone 4 has an outstanding display... and I'm glad that Apple resisted the emotional rush to OLEDs because they still need lots of improvement before they will be ready to compete with the highly refined IPS LCDs. The iPhone 4 display should be comparable to the outstanding IPS LCD in the Motorola Droid, which I tested and compared to the Nexus One OLED, which was trounced by the Droid.

Steve Jobs claimed that the iPhone 4 has a resolution higher than the retina - that's not right:

1. The resolution of the retina is in angular measure - it's 50 Cycles Per Degree. A cycle is a line pair, which is two pixels, so the angular resolution of the eye is 0.6 arc minutes per pixel.

2. So if you hold an iPhone at the typical 12 inches from your eyes, that works out to 477 pixels per inch. At 8 inches it's 716 ppi. You have to hold it out 18 inches before it falls to 318 ppi.

So the iPhone has significantly lower resolution than the retina. It actually needs a resolution significantly higher than the retina in order to deliver an image that appears perfect to the retina.

It's a great display, most likely the best mobile display in production (and I can't wait to test it) but this is another example of spec exaggeration.

...

So there you have it - some math from a display expert showing that, while the iPhone 4's display is certainly exciting and probably represents a step forward for smartphones, it may fall short of Apple's claims of meeting or exceeding the resolution of the human retina.
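For readers who want to check Dr. Soneira's arithmetic, here's a minimal sketch of the calculation in Python. The only input is his 50 cycles-per-degree figure (which works out to 0.6 arc minutes per pixel); the distances are the ones he cites.

    import math

    CYCLES_PER_DEGREE = 50                               # Dr. Soneira's retina limit
    ARCMIN_PER_PIXEL = 60 / (CYCLES_PER_DEGREE * 2)      # one cycle = a line pair = 2 pixels -> 0.6 arc min

    def retina_ppi(viewing_distance_inches):
        # Pixels per inch needed so that one pixel spans 0.6 arc minutes at this distance.
        pixel_size_inches = viewing_distance_inches * math.tan(math.radians(ARCMIN_PER_PIXEL / 60))
        return 1 / pixel_size_inches

    for distance in (8, 12, 18):
        print(f"{distance} inches: {retina_ppi(distance):.0f} ppi")
    # Prints roughly 716, 477, and 318 ppi -- the figures quoted in the email above.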

Update 06/11/10: Dr. Soneira sent us some additional information, clarifying some of the misconceptions floating around in comments and on other sites. He wishes to stress that his comments do not mean that he thinks the iPhone 4 display (or the phone itself) is bad. On the contrary, he thinks the display seems like a significant step forward. Dr. Soneira's comments are only regarding the claim Steve Jobs made of 300 pixels per inch being all that the retina can distinguish at a distance of "10 to 12 inches." Dr. Soneira's update is as follows:

...

The iPhone 4 is actually very far from a retina display. It's a substantial discrepancy and not even close: At 12 inches the 1 dimensional linear difference is 326/477 = 68 percent. But the pixel (area) density for two dimensions, which is the actual relevant observable, is that value squared = 0.47, so the iPhone 4 is more than a factor of two from being a retina display at the typical 12 inch viewing distance. Stated another way: The iPhone display would need to have 1.3 megapixels instead of 0.6 megapixels to be a retina display.


There have been some comments that my analysis is for perfect vision. Jobs' statement is for the *retina* not the *eye* with a poor lens. If you allow poor vision to enter into the specs then any display becomes a retina display. That turns it into a meaningless concept that will be exploited by everyone. The iPhone 3GS is a retina display too for a good percentage of the population.


Specs need to be objective, precise and accurate. Allowing puffery and exaggerations in the sales and marketing starts a snowballing effect that eventually leads to the 1000% rampant spec abuse that I document for many other displays.

...
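The update's figures can be checked the same way. This sketch assumes only Apple's published iPhone 4 specs (326 ppi, 960 by 640 pixels) and the 477 ppi retina limit computed above:

    IPHONE_PPI = 326                   # Apple's published pixel density for the iPhone 4
    RETINA_PPI = 477                   # retina limit at a 12-inch viewing distance (see above)

    linear_ratio = IPHONE_PPI / RETINA_PPI          # ~0.68, the "68 percent" figure
    area_ratio = linear_ratio ** 2                  # ~0.47 -- less than half the needed pixel density

    width, height = 960, 640                        # iPhone 4 resolution, about 0.6 megapixels
    scale = RETINA_PPI / IPHONE_PPI
    retina_megapixels = (width * scale) * (height * scale) / 1e6
    print(round(linear_ratio, 2), round(area_ratio, 2), round(retina_megapixels, 1))   # 0.68 0.47 1.3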

Be sure to check out our other coverage of cell phones.

Follow Jason Cross on Twitter.


View the original article here

Another Apple iPhone 4 Flaw: A Glitchy Proximity Sensor

It happened to me on my first phone call with the new Apple iPhone 4: The display screen flashed on during the call, and I managed to inadvertently put the call onto speaker. Twice.

Now, I could crack a joke about having a talented cheek, but this isn't a joking matter: I never had these problems with my iPhone 3G or iPhone 3GS. I didn't feel as if I was holding the phone any differently; I even paid close attention over the course of subsequent calls, and confirmed I wasn't doing anything different.

What I did notice as the weekend wore on, however, was this was not a one-off occurrence. I regularly activated the touchscreen during a call. Typically, I managed to activate the keypad (and subsequently dialed numbers), mute button, or speaker; sometimes I ended up going into the contacts screen, or activating FaceTime (which in turn gave me an error message, given that I wasn't on Wi-Fi).

The clear suspect in this bizarre behavior appears to be the iPhone 4's proximity sensor, a mouthful of a term for the sensor that detects your face's location relative to the screen and enables or disables the display accordingly. On the iPhone 3GS, the proximity sensor was located to the left of the earpiece speaker. But that space on the iPhone 4 is now occupied by the front-facing camera, and the proximity sensor is above the earpiece.

What's not clear is whether the iPhone 4 screen's misbehavior is due to the new location of the sensor, or because Apple tweaked the sensor's responses in some way. It could even be a combination of the two.

In use, I observed out of the corner of my eye that the screen would blink on and off intermittently, depending upon how I held the phone. I could practically count on seeing this behavior if the phone slipped slightly away from or up from my face. And it happened consistently whenever I rested the phone between my head and my shoulder--a common position, albeit one my neck never particularly appreciates.

(Separately, the new proximity sensor location is a major reason why you shouldn't attempt to use an iPhone 3GS case while waiting for an iPhone 4 case, even though the latest-gen cases are almost as hard to find right now as an actual device. It's also one of the reasons why many case designs remain in development and are not yet available, according to Ramsey Oten, CEO and case designer for Sena Cases. Notice that the earliest designs are either pouches, or form-hugging designs like Apple's own bumper, and similar designs from Sena and Incipio.)

While little official is known about this issue yet, I found it incredibly annoying to have my calls routinely interrupted. I asked around and found my colleague, Ginny Mies, had similar experiences. And some digging online shows Apple has an open discussion thread running 19 pages long, and counting. There, a user reports that an Apple Store Genius tech said it was "probably a software issue" but still put that user in line to swap out the phone when they get more in.

Sadly, I wouldn't get your hopes up on a swap helping matters. I'm already on my second handset, and have experienced the proximity sensor problem with both. Each was a "clean" install, meaning I didn't restore from a backup or anything else that might have impacted iOS 4's settings. To be fair, that first handset had other issues, too--the phone app froze up twice, each time requiring a reboot, and multiple times, the touchscreen didn't respond, period (among other things). But shortly into my second call with the second handset, the proximity sensor problem kicked in again, and I activated the keypad. In my experience, I'd say it's not an isolated hardware issue.

In these early days, it's not clear if every handset is affected--heck, many users have gone straight to using a Bluetooth headset or haven't really used the phone for conversation. Nonetheless, it is clear that this is yet another iPhone 4 launch problem Apple needs to address. Soon.

In fact, I'd put this call interruptus problem right up there with the reception issues. Yes, like so many others, I can hold the iPhone 4 in the so-called death grip and watch its signal strength deteriorate bar-by-bar, but I have not dropped calls because of this problem; I just drop calls in the same locations where my iPhone 3GS always dropped calls. At least the reception issue can be solved by using a case, something most of us will do, anyway, once cases become widely available. But short of using a headset--which no phone should require--the continuous and awkward call interruptions appear unavoidable until a fix comes along.

Have you experienced wonky behavior from your iPhone 4's touchscreen and proximity sensor? Tell us in the comments.


View the original article here

HP Envy 14 Now Available for Order

Back in early May, we wrote about the slew of new notebooks announced by HP. Some lines received bigger updates than others, but one of my favorites of the bunch was the Envy 14. Set to replace the end-of-life Envy 13, the new Envy notebooks clearly take their visual cues from Apple's MacBook Pro line. The new Envys are not only slick and pretty, but sport some pretty great hardware. Though the entry price of $1,099 for the 14-inch model is a little hard to swallow in a market where everyone wants a $599 notebook that does everything, it's worth noting that you get a lot more bang for the buck, hardware-wise, than a comparable MacBook Pro.

Consider that the 13" MacBook Pro starts at $1,199. For that extra $100, you get a similar CPU (2.4 GHz Core 2 Duo compared to the 2.4 GHz Core i3-370M in the Envy 14), equal RAM (4GB), a smaller hard drive (250GB vs. 320GB), and quite inferior graphics (GeForce 320M with 256MB of shared memory vs. Mobility Radeon 5650 with 1GB of dedicated graphics memory). Oh, and the Envy has a higher-res screen as well (1600x900 vs. 1280x800), but then again, it's a little bit bigger. Choose enough upgrades to boost the price to match the next MacBook Pro up (the $1,499 13-inch model) and the difference in specs tilts even more heavily in the Envy 14's favor.

Of course, a good laptop is more than the sum of its hardware specs, and we haven't had a chance to review the Envy 14 just yet. We're anxious to put it through its paces, and we'll post a review here as soon as we can. If you want to jump in before our review, you can order an Envy 14 from the HP store.

Follow Jason Cross on Twitter.


View the original article here

Samsung N230 Netbook Promises 13.5 Hours of Battery Life

Samsung announced a new netbook model today in the N230. At first blush, it doesn't seem like anything particularly special: a 10.1-inch screen with a resolution of 1024 by 600, an Intel Atom N450 or N470 CPU, 802.11b/g/n Wi-Fi, and a weight of around 2.2 pounds. The eye-catching part is the company's claim that this netbook will last for 13.5 hours on a single charge. How is it achieving such astounding battery life from a regular Windows-running netbook? Samsung talks about its efficient LED display and "proprietary Enhanced Battery Life (EBL) solutions" in its press release, but upon closer examination we can see what's really going on...

The N230 netbook has a high, but not especially amazing, battery life of 7 hours with the standard battery. The 13.5 hour claim comes when you use the optional 65 watt-hour long-life battery. Samsung doesn't say exactly what this battery will do to the netbook's bulk or weight. Still, this is an impressive feat, if the real battery life is anywhere close to Samsung's claims. We have tested netbooks with extended batteries before, and none have quite come close to that sort of runtime. Then again, we often find the battery life claims of manufacturers to be a bit...optimistic...compared to our lab tests.
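As a rough back-of-the-envelope check on that claim (a sketch that assumes the full 65 watt-hours are usable and ignores battery aging), the implied average power draw looks like this:

    battery_capacity_wh = 65            # the optional long-life battery
    claimed_runtime_hours = 13.5        # Samsung's headline figure

    average_draw_watts = battery_capacity_wh / claimed_runtime_hours
    print(f"Implied average system draw: {average_draw_watts:.1f} W")   # about 4.8 W

Under 5 watts is conceivable for an Atom netbook that's mostly idle with the screen dimmed, which is probably the kind of workload behind the claim.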

Samsung says the N230 is available now and should cost around $400, but we haven't seen it pop up on our favorite shopping sites just yet.

Check out our Top Netbooks Chart.

Follow Jason Cross on Twitter.


View the original article here

Thursday, April 19, 2012

Alienware M11x gets Core i5, i7 Upgrade and Nvidia Optimus Tech

Few gaming laptops have charmed us as much as the Alienware M11x. It's a bit bulky compared to other 11.6-inch ultraportable laptops, but absolutely tiny compared to most laptops designed for gaming. Its overclocked Core 2 Duo SU7300 processor and GeForce 335M mobile graphics chip give it the muscle needed to truly play the latest games at high settings. With most notebooks that size, you have to turn the settings down pretty far to get decent performance. The big eight-cell prismatic battery gives it over 7 hours of working time in our tests, as long as you flip the switchable graphics over to the Intel integrated GPU.

Now, our favorite ultraportable gaming machine is getting even better. Starting later this month, the M11x will swap out the Core 2 Duo processor in favor of ultra-low voltage versions of Intel's Core i5 and Core i7. These chips won't have their default clock speeds raised, as the current version does, but these CPUs feature Intel's Turbo Boost technology that automatically overclocks the chips in certain situations. Alienware representatives tell us to expect a significant performance increase. Battery life should range from "the same" to "maybe 15 or 20 minutes less", depending on how you use the system.

The existing M11x features manual switchable graphics, where you enter a special keystroke to switch between Intel's low-performance but battery-friendly integrated graphics and the high-performance GeForce 335M discrete GPU. The new version will include the fantastic Nvidia Optimus technology, which automatically and invisibly switches between the two based on what application you're running.

The only other real change is a minor cosmetic tweak: The black version will feature a matte finish instead of the glossy black of today's model. The silver version will be identical.

Check out our Laptop Reviews.

Follow Jason Cross on Twitter.


View the original article here

First Look: Android 2.2 (Froyo) with Flash Player 10.1

I got an advance look at Google's latest treat for Android phones, Android 2.2 (more deliciously known as Froyo) on the Nexus One. Announced this morning at Google I/O in San Francisco, the update will initially be available to Motorola Droid and Nexus One owners in June. Android users will definitely be happy with this update, which delivers faster performance, tethering/mobile hotspot and of course, Flash support.

Flash Player 10.1: Great for Watching Video

At last, full Flash support has arrived on Android. Overall, the whole experience is quite good, but I encountered a couple of issues in my hands-on. Video playback looked excellent on the Nexus One's screen. I watched a couple of trailers on the Warner Brothers site and was impressed with how smooth the playback was.

Flash support brings some big gaming potential to the Android platform. I tested South Park Studios' make-your-own-avatar feature and was amazed at how speedy the game was. Other games, like a baseball game on Kongregate, ran smoothly as well. Farmville fans will also be delighted to learn that the ridiculously addictive social networking game is Flash-based. Now you'll never get away from your farm.

Of course, not all sites were so fast. While the kids' educational site Ecoda Zoo looked gorgeous on the Nexus One, it moved painfully slowly. I tried playing a couple of beloved Flash games that aren't optimized for mobile and was disappointed that I couldn't play some of them without a keyboard. For example, Dino Run requires pressing the spacebar for a certain action, but I couldn't bring up the touch keyboard (it only appears when you're in a typing field).

Interestingly enough, there's a shortcut on one of the homescreens to a page with recommended Flash-enabled sites and games including South Park Studios, BBC, Sony Pictures, Armor Games and more. TechCrunch dug up the lists for both the Nexus One and the Droid and pointed out that the two are different. In fact, the list for the Droid is much shorter than the Nexus One's.

Missing from all of this Flash action, of course, is Hulu. I was really disappointed when I tried--and ultimately failed--to watch an episode of "30 Rock" on the Nexus One. According to Adobe, Hulu does not own distribution rights for its content on mobile devices and therefore cannot stream video to smartphones. With no Hulu on the iPad and no Hulu on your Android phone, isn't it time for Hulu to develop an app? Let's hope so.

Android 2.2: Faster Performance, Wi-Fi HotSpot and Tethering

While Flash Player is clearly the biggest addition, there are definitely a few other gems in the update. I did some side-by-side tests with a Nexus One running 2.1 and, right off the bat, I noticed how much faster 2.2 is. Native apps launched quicker and scrolling through Web pages felt smoother on my 2.2 Nexus One.

I'm not sure how the carriers will handle this, but I was able to turn my T-Mobile Nexus One into a mobile hotspot without any issue. Tethering worked just as smoothly.

One of the biggest weaknesses with Android was the inability to download apps to your microSD card; you had to resort to using your precious internal memory. Now, with the 2.2 update, you can store your apps on a microSD card.

Finally, there are also a few subtle cosmetic tweaks in the update. On the homescreen, there are three permanent shortcuts: to the dialer app, the app menu and the browser. Éclair (2.0/2.1) only has a shortcut to the app menu. Other than that, 2.2 looks pretty similar to 2.1.

The next treat for Android fans is Gingerbread, coming in Q4 of 2010. What do you want to see in the next version of Android? Leave your answer in the comments below.


View the original article here

Samsung Galaxy S: How Does It Measure Up to the Competition?

This spring, Samsung introduced the Samsung Galaxy S, a super Android smartphone to rival the HTC EVO 4G, the various Droids (both Motorola's and HTC's) and, of course, the iPhone 4. Versions of the Galaxy S will be making their way to U.S. shores this summer in four different form factors, one for each of the four major U.S. carriers. I was lucky enough to get my hands on the original European Galaxy S and did some quick side-by-side comparisons with the other hot phones of the summer.

Design and Display

When I first picked up the Galaxy S, I was amazed at how thin and lightweight it was. I was also surprised by how familiar it looked. The design is actually very iPhone 3GS-like, with an all-black, shiny plastic body and minimal buttons on the phone's face. At 0.39 inch thick, it is thinner than both the EVO 4G and the Droid X, but slightly beefier than the ultra-slim 0.37-inch iPhone 4. It is the lightest of the bunch, weighing a scant 4.2 ounces.

The Galaxy S's feather-light weight is due in part to the Super AMOLED technology, which Samsung first introduced at Mobile World Congress on the Samsung Wave. Super AMOLED puts the touch sensors on the display itself rather than on a separate layer (Samsung's old AMOLED displays had this extra layer), making it the thinnest display technology on the market. Super AMOLED is fantastic; you really have to see it in person to appreciate it. Colors burst out of the display, and animations appeared lively and smooth.

The Galaxy S's 4-inch display is larger than the iPhone's (3.5 inches), but smaller than the HTC EVO 4G's and Motorola Droid X's displays (4.3 inches). Despite its smaller size, the Galaxy S outshone both the Droid X and the EVO 4G in my casual side-by-side comparisons. The side-by-side with the iPhone 4 was a closer call. The iPhone 4's display appeared slightly sharper, but I thought the Galaxy S's colors looked more natural. It is really hard to declare a winner--both displays are stunning.

Samsung TouchWiz 3.0 with Android 2.1

The Samsung Galaxy S runs Android 2.1 (Eclair) with Samsung's own TouchWiz 3.0 user interface. Overall, this version of TouchWiz is a lot better than the version on the Samsung Behold II for T-Mobile, which was slow and difficult to navigate. But while this version is an improvement, I encountered some familiar issues with TouchWiz 3.0. Despite the 1GHz Hummingbird processor, the phone lags when launching apps, flipping through menus, and scrolling down contact lists or Web pages. This could be because this is a pre-production unit, however, and not everything is in perfect working order.

Like HTC with its Sense interface, Samsung has its own social media aggregator. Social Hub combines streams from your Facebook, MySpace and Twitter accounts into a single view. It is a useful feature if you need a simple way to keep track of your networks. One odd feature is Mini Diary, which lets you create blog entries with photos, weather info, text and more. This would be a great feature if you could actually sync this information to your blog or Facebook profile--but weirdly, you can't.

Camera

We put the Galaxy S's 5-megapixel camera through a modified version of our PCWorld Lab test for point-and-shoot digital cameras, along with the iPhone 4, the Motorola Droid X and the HTC EVO 4G. Unfortunately, our test panel was not very impressed with the Galaxy's photo quality. The Galaxy S earned the lowest score out of the four and an overall word score of "Fair." It finished ahead of the EVO 4G in terms of exposure quality, but finished last in our color accuracy, sharpness, and distortion tests.

On the other hand, it took second place in overall video quality. Its scores were skewed heavily toward its good performance in bright light. According to our panel, bright-light footage looked a bit underexposed and slightly grainy in a full-screen view, but great at smaller sizes. The Galaxy S's autofocus searches a bit before locking onto a crisp image. Its microphone actually picks up audio a bit too well: our audio clip sounded far too loud and blown-out, while it was barely picked up at all by some of the other smartphones in this comparison. In low light, the footage was a bit too murky and undefined to earn a better rating. Read the full test results in our Smartphone Camera Battle: iPhone vs. the Android Army.

Keep an eye out for full reviews of the Samsung Galaxy S phones, including the Samsung Epic 4G (Sprint), Samsung Vibrant (T-Mobile), Samsung Captivate (AT&T) and Samsung Fascinate (Verizon).


View the original article here

Nvidia Unveils Next-Generation Ion Platform

We've been pretty big fans of Nvidia's Ion product for netbooks, which turbocharges the lame integrated graphics found in Intel's Atom line with something really capable of decoding all that hi-def Flash video on the web and even playing a few basic 3D games. If you'll recall, the previous generation of Intel Atom-based netbooks were three-chip solutions: you had the Atom CPU, the "North Bridge" containing the memory controller and integrated graphics, and the "South Bridge" with all the I/O and interconnect stuff. The Ion platform replaced both the North Bridge and South Bridge with what Nvidia calls an MCP - a media and communications processor. It's basically a single chip that includes the memory controller, I/O, and integrated graphics. In other words, the original Ion brought the three-chip Atom solution from Intel down to a two-chip solution, while improving graphics performance. It was a major selling point.

The Next-Generation Ion, revealed today, sort of makes the hairs on the back of my neck stand up, not because it's a bad product, but because the marketing message isn't clear to consumers. It's no longer a "platform" - it's an add-on to an existing platform, in the same way that any GeForce discrete graphics chip is an add-on to any notebook of any size. It's just... GeForce for netbooks. For this and other reasons, the whole marketing message around the Next-Gen Ion is a little worrisome. Allow me to explain.

The new Intel "Pine Trail" platform for netbooks (those netbooks with the Atom N450 or N470 CPU) gets rid of the North Bridge chip - the Intel graphics and memory controller have been integrated into the CPU itself. So with Pine Trail, Intel is down to a two-chip solution: the CPU and the South Bridge. What the next-gen Ion does is boost that back up to a three-chip solution by adding a GPU, complete with up to 512MB of its own DDR2 or DDR3 memory (something that wasn't required in the original Ion, mind you). This graphics chip connects to the South Bridge via a PCI Express x1 link.

The new Ion will use the same Optimus technology Nvidia recently unveiled for larger notebooks. This is cool stuff. Basically, it's automatic switchable graphics that you, as a user, never need to even think about. Instead of the computer having an internal hardware switch to turn off the Intel integrated graphics and turn on the Nvidia discrete graphics, the computer simply always displays the Intel integrated graphics' frame buffer contents. The special Nvidia driver detects when you start viewing video or running 3D graphics and will power up the discrete graphics, copying the frames it renders to the Intel integrated graphics' frame buffer. Then, it automatically shuts off and powers down when you stop viewing video or running 3D. Optimus is really cool stuff, but the fact that the next-gen Ion uses it only further underscores that what once was a replacement "platform" for Atom-based netbooks is now an additional discrete graphics chip.
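To make that hand-off concrete, here's a toy sketch of the logic described above. This is not Nvidia's driver code; the class, method, and workload names are invented purely for illustration.

    class OptimusSketch:
        """Toy model: the display always scans out of the Intel IGP's frame buffer;
        the discrete GeForce is powered up only for demanding work, and its finished
        frames are copied into that frame buffer."""

        def __init__(self):
            self.discrete_gpu_on = False

        def render_frame(self, workload):
            # Hypothetical classification; the real driver uses application profiles.
            needs_geforce = workload in ("3d_game", "hd_video_decode")

            if needs_geforce and not self.discrete_gpu_on:
                self.discrete_gpu_on = True       # power the discrete GPU up on demand
            elif not needs_geforce and self.discrete_gpu_on:
                self.discrete_gpu_on = False      # and shut it off when it's no longer needed

            if self.discrete_gpu_on:
                # Render on the GeForce, then copy the frame over PCIe into the IGP's buffer.
                return "GeForce frame -> copied to Intel frame buffer -> display"
            return "Intel IGP frame -> display"

    gpu = OptimusSketch()
    print(gpu.render_frame("web_browsing"))   # stays on integrated graphics
    print(gpu.render_frame("3d_game"))        # wakes the discrete GPU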

The Next-Generation Ion will come in two flavors. One has 8 graphics cores, the other 16. They're really the same chip, based on a very low-end version of the GT2xx family, with some parts "fused off" on the 8-core version to achieve the lower power and thermal requirements of smaller netbooks. My guess is that it's the GT218 chip, found in the GeForce 210M. The 8-core version can have either a 32-bit or 64-bit memory interface, while the 16-core version always has the 64-bit memory interface. You'll find the 8-core chip almost exclusively in 10-inch or smaller netbooks, while the 12-inch ultra-premium netbooks and desktop "nettops" will get the 16-core version, since they have a little more wiggle room on thermals and power. Nvidia says performance of the 8-core version should be comparable to the original Ion, while the 16-core version should be as much as twice as fast.

Here's another problem with the marketing: you don't really know which one you're getting. One version of the chip is literally twice as fast as the other, but both are simply called "Next-Generation Ion." Nvidia tells us all the 12-inch netbooks and nettops are using the 16-core version, but there's nothing to force manufacturers to do this. If someone wants to make a super-thin 12-inch netbook and use the 8-core version, you really wouldn't know it unless the manufacturer spells out in the specs exactly which version of the "Next Generation Ion" it's using. This is like Nvidia selling two different GeForce mobile chips, one twice as powerful as the other, and not giving them different model numbers to tell them apart. In fact, that's exactly what this is.

Nvidia is focusing on the "experience" of Ion, which is genuinely much better than the stock integrated graphics you get with an Atom-based netbook or nettop. The video decoding acceleration is worlds better, as is the 3D graphics performance. That was true of the previous generation, too. Nvidia has made some hay with the press about how the new Ion has a package size of 23mm by 23mm, which is 40% the size of the original Ion's package, thanks in part to a shrink from 65nm manufacturing to 40nm. This is a completely disingenuous comparison. The first-gen Ion was an "MCP" product - a complete GPU plus memory controller plus I/O controller in one. It replaced two Intel chips with one from Nvidia. The new Ion replaces nothing; it adds a 23mm by 23mm chip and a bank of dedicated graphics memory to the new two-chip Intel Pine Trail platform.

I'm all for giving users the ability to get away from the truly awful Intel integrated graphics and buy something better. The new Ion is definitely good news in that regard. But let's call a spade a spade - the new Ion is just a really low-end GeForce mobile discrete graphics chip for netbooks. To be fair, Nvidia doesn't claim the Next Generation Ion is a "platform" in its marketing materials, as far as I can tell. But by keeping the same branding as the previous product, which is a platform, it confuses consumers about what they're actually buying.


View the original article here

Why I Switched from iPhone to Android

Last week, I joined what must be millions of other technology nerds (if my Twitter and Facebook friends are any indication) in getting rid of my iPhone 3G* in favor of an Android-based phone. Why on earth would I do such a thing? Aren't iPhones basically the best smartphones on the market? Increasingly, I'm not sure that's the case. Besides, it's not simply about overall phone quality.

The reasons I switched closely mirror those that Daniel Lyons outlined in his piece at Newsweek. Here's the breakdown of the reasons I jumped ship, and why I think many formerly loyal iPhone users might be jumping ship, too.

First, there's AT&T. I live and work in San Francisco, which is basically ground zero for crappy AT&T service. I was tired of the dropped calls, but I don't talk on the phone all that much. The bigger problem was having "four bars" of 3G service, trying to go to a website, and being told there was no network connection. I can't count the times I've reloaded a web page or TweetDeck trying to get my seemingly well-connected phone online. My contract with AT&T was over, so this was a good opportunity to jump ship to Verizon. I don't really care if Verizon's 3G isn't quite as fast as AT&T 3G. Slightly slower but reliable beats faster and spotty every time. (This, by the way, is why carriers and phone vendors should cut it out with the exclusivity deals. When AT&T loses a customer, so does Apple. When Apple loses a customer, chances are high that AT&T does, too.)

Then we have Apple's app store policies. Apple is changing the terms in their OS 4 update to the iPhone (coming this summer) to basically disallow any intermediate software layers in the creation of iPhone apps. This means devs can't use Adobe's popular Flash-to-iPhone compiler, nor products like MonoTouch. The Unity 3D engine may or may not be affected. Is it Apple's right to do this? Maybe, but I don't really care. Apple's official reason is that intermediate software layers produce sub-standard products. The sorry state of iTunes on Windows, which uses CoreFoundation and CoreGraphics, might prove their point. But shouldn't developers and consumers be the ones to decide if software is crappy or not? And if Apple is so concerned about software quality, how come so many apps make it to the App Store in an almost unusably buggy state? How come there are so many completely worthless junk apps? Apple's quality concerns are demonstrably B.S.

Apple also refuses to support Flash in its browser. Fair enough. Maybe the future of web video and interactive entertainment is HTML5, but the now of web video and interactive entertainment is Flash. Video sites that rely on protecting content can't use HTML5 video yet, and HTML5 is a long way from having the tools or penetration necessary to make the equivalent of Flash's incredibly popular web games. Google went ahead and demonstrated how well Flash can run on a phone - Apple claims they give you the "whole web" on iPhone and iPad, but Google is actually delivering it.

Which brings me to Froyo (Google's cute name for Android 2.2). I'm mighty impressed by what Google is doing here. It's very fast, has some great new developer features, integrated honest-to-goodness Flash 10.1 without compromises, tethering, and more. Of course, iPhone OS 4 brings with it a host of big changes, and it looks like video chat will probably be part of that. But I'd have to buy a new iPhone, and that may mean sticking with AT&T. The only problem is, I don't have any confidence that Apple will implement video chat in some sort of standards-compliant way. I feel like video chat is likely to be iPhone-to-iPhone only, or maybe to Macs with iChat.

Ultimately, my reason for switching can be summed up thusly: I used to feel that, to get the best smartphone software and hardware experience, I had to live in Apple's walled garden. Now, the walls are getting higher, and life outside the garden looks better and better. I can get a really great smartphone without some company telling me I can't switch out the keyboard, or the dialer, or the voice mail program, or the browser. I can get a world-class smartphone without putting up with AT&T's spotty network. I don't have to put up with supporting a company that enforces its restrictive App Store policies in a seemingly arbitrary and draconian manner. I'm not sure I agree with those who say Google has "leapfrogged" Apple in phone development, but I certainly think they're doing a comparably good job.

So, last week, I walked into Best Buy and bought an HTC Droid Incredible, and so far I've been more than happy with it. Now if only more game developers would flock to Android as customers seem to have done. Oh well, I still have my iPad for that (I'd buy someone else's tablet if anyone were making a tablet nearly as good as the iPad).

Follow Jason Cross on Twitter.

* I didn't actually get rid of it. I still have it, it's just not my phone. I'll hang on to it as a portable game machine, for now.


View the original article here

Wednesday, April 18, 2012

Lenovo Introduces Media-Savvy IdeaPad Y560

When you mention Lenovo, you usually call to mind business-oriented laptops. The ThinkPad line has been a staple of the business traveler - simple, black, and sort of boring, but priced right and very easy to work on. We haven't met a Lenovo keyboard or touchpad we didn't like. But a Lenovo laptop for media-minded consumers? Even for gamers? Surely that's heresy, right?

Today, Lenovo has made available the IdeaPad Y560, a product that might just get general consumers, and even gamers, to reconsider the brand. It has a slicker industrial design, with red accents and a textured lid, but it's what's inside that really has me intrigued. The entry-level model features a Core i3-330M CPU, 4GB of DDR3 memory, and a quite capable Mobility Radeon HD 5730 discrete graphics chip with 1GB of video RAM. Not bad for under $1,000. You can step up to Core i5 dual-core and Core i7 quad-core CPUs from there, or add a bigger 500GB 7200-rpm hard drive if you're willing to spend $1,299. The top-end model boosts the RAM to 8GB and is a bit pricey at $1,599.

So how about it? Would you consider Lenovo when buying a media-centric consumer laptop, or will the company never shake its "business-only" image?

Read our reviews of Lenovo laptops.

Follow Jason on Twitter.


View the original article here

Intel Launches Ultra-Low Voltage Core i3, i5 and i7 CPUs

Intel has officially launched Ultra-Low Voltage (ULV) processors in the Core i3, Core i5, and Core i7 product families today. The company claims the chips offer up to 32% better performance than the comparable ULV processors in the Core 2 family, popular in many ultraportable PCs. At the same time, power usage is reduced by a promised 15%, again relative to the company's current ULV products.

The branding gets a little confusing at this point. The Core i5-520UM, for example, runs at 1.066 GHz and carries a TDP (thermal design power) of just 18 watts. The Core i7-640UM runs at 1.2GHz and, like many other mobile Core i7 processors, has two cores. It may be a little confusing for consumers to see Core i7 in the specs of one laptop that is significantly less powerful than a Core i5-based machine, simply because the Core i7 is the ULV version and the Core i5 is not (standard mobile Core i5 processors run at up to 2.53GHz). Note that the ULV Core i3, i5, and i7 mobile processors officially support DDR3 only at speeds up to 800MHz, while the standard versions also support 1066MHz DDR3 memory, so there could be a significant difference in memory bandwidth as well.
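To put a rough number on that bandwidth gap (a sketch that assumes the typical dual-channel, 64-bit-per-channel DDR3 configuration; actual laptops may ship with a single channel populated):

    def ddr3_peak_bandwidth_gb_s(megatransfers_per_second, channels=2, bus_width_bits=64):
        # Peak theoretical bandwidth = transfer rate x bytes per transfer x channels.
        bytes_per_transfer = bus_width_bits / 8
        return megatransfers_per_second * 1e6 * bytes_per_transfer * channels / 1e9

    print(ddr3_peak_bandwidth_gb_s(800))    # ULV parts, DDR3-800:   ~12.8 GB/s
    print(ddr3_peak_bandwidth_gb_s(1066))   # standard, DDR3-1066:   ~17.1 GB/s (about a third more)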

Having said that, the new ULV Core i3, i5, and i7 processors should provide a significant boost in performance and even a modest improvement in battery life over the existing ULV Core 2 Duo processors. If you're in the market for a really thin and light ultraportable laptop, it behooves you to wait a month or two for the laptops using these chips to hit the market. We even hear rumors that the stellar Alienware M11x will get an upgrade to these new CPUs.

You can read the official press release here.

Follow Jason Cross on Twitter.


View the original article here