Author Topic: OT "If iPhone 6 Were Actually Better"  (Read 15175 times)


Offline SirNick

  • Frequent Contributor
  • **
  • Posts: 589
Re: OT "If iPhone 6 Were Actually Better"
« Reply #25 on: October 07, 2014, 08:21:26 pm »
I honestly can't think of anything that an iPhone does significantly better than say a Nexus 5 or Galaxy S5 or Moto X or LG G3. Can you explain what you mean?

That's hard to quantify, honestly.  One thing I've noticed is that I tend to think like Apple.  (Or at least I did, back in the Jobs / Ive / Forstall period.)  So, the UI design, the hardware design, all that stuff just seems right to me.  When I pick up a Galaxy, I don't get the same instant connection.  It might sound stupid, but it has always been this way.  That's the gist of it.  If you have any interest, here's a little more detail:

Years ago, in the iPhone / iPhone 3G era, I was convinced I would rather have a phone that lasts a week on a charge than a smartphone you have to plug in every day.  Then a coworker got an iPhone 3G, and we used it to navigate on a business trip to Washington DC.  I had a work Blackberry at the time (which I liked), in addition to my own Motorola Razr (which I also liked).  Within about an hour of first contact, I decided I was ready to dump both for the iPhone.

The openness of the Android platform was a strong motivation to try it out.  I am all about Linux and OSS and all that.  I don't like being subjected to iTunes, not having an SD card slot, and having little freedom in terms of third-party app distribution and stuff like that.  However, most of those decisions were at least in part made to keep the platform pure.  (Admittedly, I believe the SD card thing is all about revenue, with maybe a touch of "no unessential external ports" obsessiveness on the part of Ive and Jobs.)  The walled garden app store, the lack of exposure of the underlying system -- those things keep unskilled hands off the guts that can be damaged.  And it works, really really well.  It's a double-edged sword, but I get it.  There's a reputation at stake.  I admire the huevos rancheros that Jobs had when he made those "if we can't do it right, we're not going to do it" decisions, against public opinion.

Also, the fact that "it just works" is the reason I recommended a MacBook and iPhone to my mom.  She took to it right away, despite not being very tech literate, and I hardly have to help her with problems.  With an Android and/or PC, I would have to walk her through a lot of stuff.  At least on the PC side, I can say that with certainty.  She's used a PC for years (she was even slightly concerned about moving to Mac), and I've had to fix PCs for her for years.  I've only had to help her figure out a couple things on the Mac.  That says "good design" to me.

It's not just about Luddite-friendliness though.  There's an attention to detail that makes the experience seem polished.  Some things slip -- like the antenna on the 4, the structural rigidity on the 6 -- and it can be difficult to understand how those gaping holes get through while everything else is so refined.  It's a weird contradiction, and I'm afraid that without Jobs and Forstall, it might happen more often.  iOS 7 is clearly not as refined as 6, and the large-phone thing is another step away from the focus on proportion and balance, toward public demand.  I question those changes, because although I don't always like the consequences, I did and do appreciate their determination to do it well, or not at all.

In comparison, when I pick up a plastic phone with the fastest quad-core ARM they could stuff in the thing, I don't see attention to detail.  I see marketing.  (And yes, I wholly appreciate the irony there.)  I expect there are no less than a dozen things that don't work quite right because they were more concerned with feature quantity than quality.  To be fair, I haven't owned an Android for the same length of time, to really get to know the OS and the hardware, but I have used phones and tablets that friends and coworkers have bought.  I have never felt like they were well-designed or polished.

Likewise, I continued using Google Maps when Apple decided to cut their Google ties and make their own Maps app.  Now, I find myself using Apple Maps again, because Google's insistence on hiding features behind gestures, without any visual clue that they exist, is annoying.  They're also constantly fiddling with things (like YT comments, or the Drive app), so I never know if something I did before will still work now.  Yes, these things can be learned, and I'm not opposed to change, but I do get a little frustrated when I'm at a red light and need directions and have to re-learn how to use the friggin GPS.  By contrast, Jobs had this criterion in mind:  You should "just know how to use it."  I study user interface design from time to time (not intensively, just for fun), and when I heard him say that, I thought, "there's someone who gets it."  Google seems to be heading in the exact opposite direction.  "Wouldn't it be cool if you spun the phone on your fingertip like a basketball to get into Street View??  Let's do THAT!"  Ugh.

This is probably going to come across as fanboism, and there's probably little I can say to counter that.  I respected Jobs, and I appreciate the attention to detail and the constant drive toward perfection that he had.  I understood the realities of market share and the resulting decisions to try and lock folks into a platform.  (I didn't like it, but I understood it.)  I respect Ive, and appreciate the balance he strikes between cutting-edge features and features that work.  And, despite apparent social issues, I also respected Forstall's UI design skill.  That was a killer combo, and those guys did amazing work.  I do not just "accept" everything they do/did as "perfect" like a sheep.  I do not know if the current team has the same blend of traits required to continue Apple's legacy.  But at least in the past, the resulting products have been exactly the right combination of things for me.  That is all.
 


n45048

  • Guest
Re: OT "If iPhone 6 Were Actually Better"
« Reply #27 on: October 08, 2014, 06:20:13 am »
My personal experience with Apple products is that they are hardly innovative or revolutionary.

OK, I admit, the first iPhone set the benchmark pretty high when it came to smartphones. Prior to that, we were left with the likes of Symbian, Windows CE/Mobile and PalmOS. But Android was already well into development, so other manufacturers quickly followed suit.

My biggest bugbear with Apple devices is the features they deliberately leave out over the course of their various models (in both hardware and software). First it was 3G that was missing, then LTE/4G didn't show up until reasonably late. No external storage support, no NFC, no built-in inductive charging, no multi-tasking, not even copy/paste at one point! The list goes on. Then there are their proprietary connectors -- not once, but twice!

I also have to say I've watched the physical build quality and design undergo steady decline yet prices get higher and higher.

I don't believe there is one handset out there that does everything, but in Apple's case, their product doesn't even come close. I really view it as just a very expensive toy.
 

Offline SirNick

  • Frequent Contributor
  • **
  • Posts: 589
Re: OT "If iPhone 6 Were Actually Better"
« Reply #28 on: October 09, 2014, 09:13:05 pm »
So the vast changes that came with iOS 7, which basically copied Android as best they could, didn't annoy you?

iOS 7 shook my confidence considerably.  When I first installed it, I was pretty certain that would be my last iPhone.  I've since settled down a little, and I'm willing to give the iPhone 6 with iOS 8 a try -- mostly because I've always really, really liked the iPhone; always been luke-warm about any Android device I've used; and have a (somewhat) significant investment in apps that tie me to the platform.

But, if the 8-on-6 combo doesn't work out, I will have to consider other options.

I know what you mean here. I got my mum a Chromebook and she gets on well with it. Can't get her to move on from her Nokia dumbphone though.

I strongly considered this, but decided she would want a "real" computer still.

The thing is, I wouldn't say that is exactly a glowing recommendation for the MacBook or iPhone. You are basically saying it's a beginner's device, a Fisher-Price My First Phone rather than a serious tool. I had a plastic hammer that was really easy to use when I was a child, but that doesn't make it a good tool for driving in nails.

Not at all.  IMO, OS X is the perfect compromise.  It's intuitive, so it doesn't take long to figure out.  It's practically maintenance free.  (The first time I used OS X, I Googled for a while to figure out how to install device drivers, only to eventually find out -- "you don't, it just works."  That was hard to accept, coming from Windows.)

OTOH, it has a bash shell and POSIX compatibility, so it's almost a Linux replacement as well.  The dual-personality of a simple, functional, pretty desktop plus all the power of BSD and GNU software... pretty compelling combo.

My SO is what you might call a "power user", in that she's not super technical (wouldn't be able to tell you by a picture if a connector was SATA or Ultra 160 SCSI), but she's always finding productivity tweaks and learning gestures and otherwise making life easier for herself.  She loves OS X -- swiping, two-finger scrolling, all that stuff.  My main exposure is through a little time on her Air, and with a couple Hackintoshes I've built, so I don't have it down like she does, but I appreciate what I see.

It's okay if you happen to really like the way Apple forces you to do everything, but if you want to do something a different way or want to do something they don't allow then too bad. It's ironic coming from the company that did the famous 1984 ad and had "think different" as their slogan for years, because their device is all about conformity and appealing to the lowest common denominator.

You're right and wrong there, IMO.  I don't like being "forced" into anything, and yes Apple exerts some control in ways that I don't appreciate.  Particularly on the iPhone, but not so much on OS X -- with the one (albeit large) exception that it's only officially available for Apple OEM hardware.  However, like I said before, there are often real and tangible benefits to their stubbornness -- namely, it's a well-groomed system that doesn't lend itself to being rendered unusable by ignorant users.  I don't mean to be condescending with that statement, but ultimately, viruses and broken system software tend to be the result of people poking where they shouldn't.  Take away that opportunity and those problems aren't as prevalent.  So, while the shackles chafe, I am forced to understand the motive behind those decisions.  (And yes, as I also said before, there are vendor lock-in incentives as well.  I totally acknowledge that.)

For that matter, MS is 100% as guilty.  C#, .NET, Office, IE, Exchange, Active Directory... they do everything they can to trap you into their world on the desktop, and if you'll notice, both Android and WinMo have since adopted the Store paradigm, because it works really well.  Albeit, maybe not as tightly controlled.


Really? Talking about Android specifically, what would she have been unable to do without help? All the basic functions are as easy to use as on iOS as far as I can tell. Some manufacturers, like Samsung, even provide a basic "n00b mode" that disables a lot of the customization options and makes it more like an iPhone, although personally I think it's pointless.

I'm not sure I can argue convincingly about this, but having used an Android, I saw the opportunities for complexity and thought -- nope.  Walled gardens, iTunes sync, iPhoto, and Time Machine are exactly what she needs.  I have two older brothers -- one is more freedom-conscious than me (strictly Android for him), yet even he suggested going Apple for her.  The other, not being so tech-centric, agreed on the iPhone, but having no exposure to OS X in his regular job and home life, doesn't care much for the MacBook.

Again, I'm surprised. When I have seen people move from a PC environment to a Mac they are typically quite confused at first. There is a lot of non-intuitive stuff, like dragging CDs to the bin to eject them. They usually ask "won't that delete/erase it?" Having menus at the top of the screen throws a lot of people off too, especially since they are merged with the system menus. I have seen people get confused as to which are system menus and which are part of the application. The way apps are started is confusing too, because some are on the launch bar thing at the bottom and some are hidden away in the Applications folder, and there is no start menu or other obvious place where they are all listed and available. The three little coloured blobs on every window are completely meaningless unless you already know what they do too, and I often see new users completely ignoring them and ending up with a very disorganized desktop full of windows they tried to manually maximize.

OK, I'll agree that PC-to-Mac conversion is a little rocky at first because you have to unlearn a lot of habits.  However, once you get that Day One walkthrough, the reaction is often "Oh, well that's pretty easy."  Because it is... it just makes sense.  The Windows way is often just as obfuscated; it only feels natural because we've had years of training.  I'll take some of your examples here:

- Dragging disks to the trash to unmount:  OK, yeah that's kinda weird and counter-intuitive.  I usually don't do this, nor do I use that technique when explaining the process to someone else.  If there's a two-button mouse, I suggest "right-click and choose Eject".  Or click once and hit the menu bar.  Or use the requisite single-mouse-button equivalent (click-hold or whatever it is -- remember, I'm a Mac-on-PC-hardware user mostly, so I'm accustomed to two buttons).  Objectively, is Windows' Safely Remove Hardware tray icon any more obvious?  (Try to imagine you're not already familiar with it.)

- Menus at the top of the screen:  I have no real opinion on this.  There is maybe a disconnect between the app window and the menu changing (sometimes subtly) well above it.  This is a relic of an era with smaller screens.  Good or bad, I don't know -- it just is.  Not insurmountable to learn.

- Apps on the launch bar:  This is but one of many ways.  It's a lot like Quick Launch, only, IMO, a much better execution.  Put your common apps on there, and many users will never have a need to use Applications.  Pinning things is easy, organizing the menu is easy, installation and app removal is often as simple as "drag from here to there" and "delete the icon in Applications", which is faaaaaaaaaaar ahead of Windows....  There's also Mission Control and Launchpad for other ways.  Learn whichever combination works for you.  Like Perl, "There's More Than One Way To Do It".  To some, that's a plus.  Others, not so much.  Can't please everyone.

- Colored blobs:  No one instinctively knew what the _ [_] X icons did either.  The first time I used Win 3.0, I minimized a Paint window and thought I had closed it without being given a chance to save my drawing.  Everyone goes through that once.  On the Mac window, when you mouse over the bubbles, you get - + X (IIRC) which are equivalent to the Windows buttons.  If you came from Windows, it should be fairly clear what they mean.  Clicking on them the first time will validate that assumption.  You also get (by default) the little Genie animation showing you the app window getting sucked into the icon on the launch bar, so you have some clue as to where it went.  (Windows gives you the window shrinking animation as well, so we're even.)

- Disorganized desktops:  For sure, you can lead a horse to water, but you can't make him understand window management.  There's really only so much you can do.  I think this was one of the primary reasons why, when Apple introduced the iPhone (and pretty much taught the market how to design smartphones, and later tablets), they did not bother trying to implement multi-window interfaces.  One screen, one app.  Simple.  Users kicked and screamed, and other platforms tried to distinguish themselves as allowing "real multitasking", but for the most part, mobile devices aren't meant to be desktops, and vice versa.  (Biggest reason I feel Windows' recent brute-force attempt to blend the two is misguided.)

Sorry, but that just seems like simple bias to me. Look at how many problems iOS 8 has had. Really fundamental stuff like wifi not working. Users' data being randomly deleted without warning. Apple are just as disaster-prone as anyone else.

At this point, I agree with you.  iOS 7 and 8 both launched with major issues.  Some of them still linger.  It breaks my heart a little, because I really had the impression Apple had HAD IT with the stagnant mobile and PC markets, and was going to successfully show the world how to do it right.  For a while they did.  Now, they're struggling.  For all their flaws, they really did put a lot of effort into innovating the user-machine interface.  Not that a lot of it hadn't been done before, just not at the right time, with the right hardware and software, and with acceptable results.

Personally, I find the iPhone design quite unappealing. It doesn't seem high end to me, just average. It definitely goes for form over function, driven by marketing. Where is the grippy material on the back to make it easier to hold? Using a custom cable screams gouging to me and low end, because it's usually the guys who give the hardware away too cheap and then make money on accessories (the razor blade model) that do that. I fully accept that's just my opinion of course.

Yep, and all valid.  I just disagree.  An iPhone looks and feels like bar-setting engineering.  All the other stuff seems like practical engineering, with attempts to win based on raw numbers.  Not my cup of tea.  To each his own.

I see it rather differently. He did strive for perfection, but only at launch. Every year a new iPhone came out, and every year it had to have a big new "wow" feature. The problem is they never bothered to keep up development and stay ahead of the game, so within six months it was surpassed by all the other manufacturers. Take Siri for example. It's not very competitive any more, having been surpassed first by Google Voice and then Cortana years ago. They were just lurching from one ultra-hyped stunt to the next. They also missed out by refusing to do certain things, like the iPad Mini which subsequently sold very well.

Again, this is probably subjective territory, but I always felt he made the right call.  Someone mentioned not having cut and paste for a long while.  True, and that had to be frustrating.  (I hopped on board just after it had been implemented.)  I remember watching a video at the time where Jobs had answered that query by saying "we didn't know how to do it in a way that worked well yet."  So they left it out.  For me, that's ballsy and earned a lot of cred.  In a market stuffed to the brim with half-baked features, that seemed like the best possible approach.  It wasn't unusable w/o cut and paste, they knew they would figure something out, and when they did it worked really well.  Bravo.

Likewise, most (not all!) of the improvements along the way followed the pattern.  LTE: Coverage was so-so and the radio ICs used a ton of power.  Result: Unacceptable hit to battery life.  Chuck it, stick with 3G for another iteration.  Security:  Passwords are a PITA -- such that many people will forego them in favor of convenience, yet they store no end of personal data on a device that can be lost at any time.  Two answers -- one, Find My iPhone with support for remote tracking, lock, and wipe, and later, fingerprint ID that actually works.

While the competition has gotten much better, they also paved new ground with Retina and a really decent speaker for its size.  Others are now out-pacing Apple, although I'm not sold on the need for 1080p+ on a small screen yet.  Similarly, the color reproduction has always been really good, although others have since taken it a notch beyond.  Before Apple, we were surrounded by Nokia displays with countable pixels and horrific color; bad to truly awful cameras; horrible speakerphones, not to even speak of music reproduction...  I would much rather listen to music on an iPhone speaker than the last HTC I heard.

Point is, there is often "as good" or even "better"... but there have rarely been as many phones that do it as consistently.  There may not be more there than some of the competition, but what's there is likely to be consistently good all around.  (Caveat:  I have not shopped for a phone since I bought my 4, so my impressions on the last few years of devices have been brief encounters.  Hardly enough to consider even my own opinion authoritative WRT what's available now.)

I feel some sympathy for Jobs due to the way he fell for the holistic medicine and spiritual healing scams, but it doesn't really suggest he was some kind of genius. Just very driven and demanding of his engineers.

I'll agree that his engineers are to thank for leaps and bounds made over the last several years.  Remarkable feats over and over again.  And a lot of it is technology other companies made (like LG, Samsung, Broadcom, etc etc etc...)  But to discount Jobs' determination is unfair.  It's that drive and demand that pushed the technology to complete his vision.  That is no small thing, and is what some other companies severely lack -- particularly Microsoft, at the moment.  They are a prime example of what technology withOUT vision leads to.

To put it another way, as technical folks, you and I and the rest of us here will have preconceived notions of what is doable, now.  We, as a group, really need someone with their head in the clouds to pull us from our myopic vision of reality.  Sometimes those guys are right.  That's why I keep bringing up the Jobs / Ive / Forstall trio.  The three of them, together, is what made Apple great.  They pulled each other forward when their individual notions were stuck in place.  I'm a little concerned what's going to happen now, but we'll see.
 

Offline SirNick

  • Frequent Contributor
  • **
  • Posts: 589
Re: OT "If iPhone 6 Were Actually Better"
« Reply #29 on: October 10, 2014, 12:42:29 am »
Well, this is getting into "I call BS" territory pretty quickly, so it might be about time to agree to disagree.

FWIW, you raise several points about Apple not being the first, not being the best, and not actually inventing the technology -- all of which I don't dispute.  But:  They have enough clout to special-order stuff, and then they mainstream it, and before long the entire industry is doing it.  Yes, often, there are those few examples of other companies that were doing it first, but whether it's marketing, or just getting the formula right, it doesn't seem to "take hold" until Apple does it.  Arguing over whether that's just the pull of a large company with a giant marketing engine and a huge cult following, or a leader who deserves credit where it's due, is somewhat pointless IMO.

Before Apple, there were no MP3 players that I would consider buying.  Mind you, I was pacing, and watching the space closely.  Someone pointed out the other day that Creative had the Zen (or whatever it was called) that used hard disks before the iPod.  It was also clunky and had terrible software.  But the iPod turned the industry into what it became, and (before I was at all a fan of Apple), the iPod was the first file-based player that I felt got it right.

Same story with the iPhone.  I was incredibly impressed by it, and I wasn't "biased" because I really wanted nothing to do with it... until I used one.  Then I was hooked.

Same with OS X.  Perfectly happy with Windows... or... at least Windows and Linux.  Then I started using OS X and thought, wow, that's a really nice OS.  (An opinion that elicited a response of "huh??  YOU like Mac OS?" from anyone I knew at the time.)

When I first got to try a Chromebook, I was impressed with that also.  Considered getting one myself... then started thinking about how I would hack it and maybe run Gentoo instead.  That's when I realized that I didn't want a Chromebook, I just wanted a small, thin laptop.  OTOH, when I used an Air for the first time, I thought "nice hardware... bet I could put Gentoo on it.... or maybe I could dual-boot.... ...  ... ... or maybe I'll just use it as-is.  This is kind of nice..."

I don't know how much more objective I can be.  I feel like I give everything a fair shake.  I'll admit, I can't bring myself to try a Windows Phone.  I just can't.  There, I am indeed biased.  As for Android, I keep going back and trying it, but I always end up thinking... "nahh... I like the iPhone / iOS better.  Maybe next time."  Similarly, my main PC runs Windows.  My laptop dual-boots with Linux.  My work PC is Linux.  But I'm very tempted to start moving what I can to OS X.


Now, because I don't want to be misquoted or really even misunderstood, I want to address some things:

- Chromebook vs. "Real Computer":  That's probably not fair.  It is a nice idea, but the idea of not having MS Office and a few other applications was a no-go for my mom.  Her choice, not mine.  She wanted an OS that still allowed her to install software she considered essential.  So that was the end of that.  For the truly platform-independent folks, it's a fine option.  I hope that's not "dismissive of anything non-Apple".  It just wasn't right for her.  Nor for me -- I want a CLI, Apache server, a C compiler, SSH....

- Devices and Drivers:  I did not have a device that "did not work on OS X."  I had a device that worked as soon as I plugged it in, it just never occurred to me that I wouldn't have to install a driver first.  So I looked around, Googled, and finally figured out that it was already up and running.  It was a USB modem, IIRC, so I didn't realize the OS had already enumerated it and added a serial port without any fanfare whatsoever.  This is before I started using Linux, so the concept was foreign.

- Windows, Devices, and Drivers:  In the days since dial-up Internet (referring to the USB modem example), Windows' stock driver support has improved dramatically.  No doubt.  However, all the way to the bitter end, XP still required AHCI drivers to be installed from a floppy disk in Drive A: if you were going to get your SATA hard drive to show up during installation.  Starting with Vista, that nonsense also improved...  However, even now, driver installation and device enumeration in Windows is archaic compared to OS X.  Heck, in many ways, even Linux is easier to deal with -- although the kernel API/ABI issues, coupled with the idea of compiling source code or dealing with "packages", may be too much for many users.  Still, modprobe is a pretty nifty little tool -- as is having drivers compiled in, so you have all your essential hardware working as long as you can manage to get the kernel extracted into RAM.  But, this is getting more subjective, and I digress...

- My Illiterate Mother:  I hesitate to even bother here.  You're calling my mother illiterate and saying things like "can't cope with a PC", so I think any hopes of civilized discussion have gone straight out the window.  Suffice it to say, she "coped" for years just fine, but things in PC-land require more maintenance than she really wanted to bother with -- and I have enough on my plate without having to add "desktop support" to the list.  So she has a Mac.  No anti-virus software constantly asking for updates for $25 a year, no Windows Updates, no Acrobat Reader.  Things just work.  And if they don't, and she needs an answer before I can come visiting, the Genius Bar can help her out anywhere she happens to be.  It's a good system, while still allowing her to have MS Office and whatever else she wants.

- Eject Buttons:  Yes, an eject button would be nice.  It went away because the bare disc slot looked prettier.  Yes, that's superficial... but now that's gone too, and we're pretty much left with thumb drives which SHOULD be unmounted with Safely Remove Hardware or the Eject menu.  Pick your poison.

- X Buttons:  If you read my example, I was talking about Win 3.0.  There was no X button.  There were two gray boxes -- one with a down arrow, one with an up arrow.  What do those do?  Well, as an eight year old (or whatever I was at the time) who had only used what passed for a GUI on a Commodore 64 before that, I had no idea.  It's not a steep learning curve, but it does exist.  At any rate, bubbles or icons, it really doesn't matter.  It's a lesson you learn within five minutes.  (Or you don't, in which case, the entire experience is likely to be terrifying -- and good luck finding an OS where that isn't the case.)

Now, say what you will about Retina and any number of other things.  "Retina" is just a handy synonym for "high-resolution display" now.  I already acknowledged they didn't create the screen.  But again, I think it needs to be appreciated how they take these things and make them ubiquitous.  You tell me the iPhone 4 barely qualifies as high resolution.  If I look reeeaaallly close, I can indeed see pixels.  Compared to every phone I had ever seen to that point (including every Blackberry released, since I was involved in distributing them at my company at the time), the iPhone 4 screen looked brilliant.  Now, it's just "not bad" -- because what's out now is better.  Now they're releasing computers with high-DPI screens.  While the MacBook certainly wasn't the first, it was just not common to do so before.  It's getting there now.  Would it have happened without Apple?  Maybe so... but they seem to always be there, doing it full-scale, before anyone else -- except for those few rare examples with nowhere near the mass appeal.  They have inertia, and they start positive trends.  (They also use snooty advertisements.  So, granted, it's not all perfect.)

This is way longer than I intended it to be (it always is), but one more thing:  Look at the transition between PowerPC and x86.  The writing was on the wall, and Apple had to shift their entire platform to a new architecture.  So, they concocted the Universal Binary thing -- which is a fancy way of saying "it's compiled for both", so far as I know.  Not exactly revolutionary technically, but simple, elegant, and effective.  As such, they managed to pull the PowerPC carpet out from underneath the whole ecosystem, and start using a whole other platform the very next model year.  And there was very little to-do about any of it.  Well planned, well executed, and as seamless as I believe that kind of thing possibly could be.  (They even had a mostly transparent binary translator available for years afterward to help cover the stragglers.  I mean, geez...)
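
(Side note for the curious: "compiled for both" really is about all a Universal Binary is -- a Mach-O "fat" file with a small header listing the architecture slices packed inside.  Here's a minimal Python sketch of my own, based on the publicly documented fat_header layout and definitely not any Apple tool, that checks for the fat magic and reports which slices a binary contains:)

Code:
#!/usr/bin/env python3
# Minimal sketch: peek at a Mach-O "fat" (Universal Binary) header.
# Based on the documented fat_header / fat_arch layout; illustrative only.
import struct
import sys

FAT_MAGIC = 0xCAFEBABE    # big-endian magic of a fat binary
                          # (Java .class files share this magic, so don't point it at one)
CPU_TYPES = {7: "i386", 18: "ppc", 0x01000007: "x86_64"}   # the common cases

def describe(path):
    with open(path, "rb") as f:
        magic, nfat = struct.unpack(">II", f.read(8))
        if magic != FAT_MAGIC:
            return "not a fat binary (single-architecture, or not Mach-O at all)"
        slices = []
        for _ in range(nfat):
            cputype, _subtype, _offset, _size, _align = struct.unpack(">iiIII", f.read(20))
            slices.append(CPU_TYPES.get(cputype, hex(cputype)))
        return "universal binary with slices: " + ", ".join(slices)

if __name__ == "__main__":
    print(describe(sys.argv[1]))    # e.g. point it at some app bundle's main executable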

It's 2014 now, and I feel like we're almost.... almost... done with x86-32 on Windows.  That is a battle MS has been fighting since Windows XP 64-bit Edition, continuing through Vista, then Windows 7, and Windows 8.  Still, though, there are two versions of Internet Explorer on my 64-bit PC.  It has been like ten years now, and they haven't yet completed the transition.  We'll see whether they continue to release a 32-bit edition of Windows 10.  (If they do, I really don't know what to say...)  That right there sums up the difference between PC and Mac to me.  It would be fair to argue that MS places more emphasis on backward compatibility, and gives you more choice.  The price for that is ten years of driver hassles and parallel Flash plugins, and so on.  Not at all simple, not at all elegant.  But you do have a choice at least, whether it makes sense or not.

So, in summary:  Android is a great thing for openness, at the expense of fragmentation and exposure to malicious code.  Chromebook is a great attempt at Computer-as-an-Appliance, but it does little more than a tablet while retaining the form factor of a traditional computer.  That might be a plus, depending on who you ask.  Oddly enough, maybe the Surface has the right idea there -- although, they're only starting to "get it" in terms of software with Windows 10.  Still too early to tell, and I'm not holding my breath.  Meanwhile, the iPad made the world take tablets seriously despite countless failed attempts before it.  And I'll admit, I laughed at it too when it came out.  I'm still unsure about the Watch -- I kinda feel like it's a solution looking for a problem.  But in five years, I'll probably look back in awe at how it changed the landscape, and even I didn't see it coming.  And I'll probably be called a fan-boy, despite initially thinking it was a silly idea.   :-//
 

Offline SirNick

  • Frequent Contributor
  • **
  • Posts: 589
Re: OT "If iPhone 6 Were Actually Better"
« Reply #30 on: October 10, 2014, 08:30:35 pm »
I hear similar stories from many Apple fans. It's the classic "I was sceptical, but then I tried it and was amazed."

So, you're telling me that people who use and like Apple products come in one of two flavors:  Either they like everything Apple does w/o reason (i.e., they're sheep), or they initially have no fixed opinion (except perhaps skepticism at the hype) and then decide they like it.  But, neither is genuine...  :-//  Is there any correct way to like an Apple product, or is it solely the domain of Kool-Aid drinkers then?

It's probably pointless, but this is why I bother to argue about it.  I'm not trying to "convert the masses to Apple".  I like the fact there are competitors.  Heck, I might jump sides at any time.  What I don't think is fair, is the stigma, and automatic labeling of anyone carrying an Apple product and/or talking about how they like it.  No explanation, no preference, is ever enough to be justifiable.  Ultimately, I like their products, and I am impressed by the design and engineering.  That's really all there is to it.  I don't see how that's different than having a preference for any other brand or product.  What else is there?

Take the iPod. Everything before was larger not because Apple had better designers or engineers, but because Hitachi hadn't invented and begun mass-producing 1.8" hard drives yet. Similarly, no-one else used a click wheel because Synaptics hadn't invented them yet. At best you could say Apple waited until those things existed, rather than actually innovating themselves.

At best... well... no, that's actually the part I'm intrigued by.  They know when to say no.  It's a very common trend with Apple.  When the tech is ready -- that is, when the user experience can be one where the technology is not in the way -- then they proceed.  There's far less concern about being in the market first.  There's a lot of wisdom in that approach, and I admire their willingness to do it.  Obviously it works for them.

You're probably going to spin this as:  "So they're good at not releasing a product until someone else invents it?  Wow, impressive."  But I think you see the difference.  The click wheel wasn't necessary.  The 1.8" hard drive wasn't necessary.  The LCD wasn't necessary.  iTunes wasn't necessary.  There were plenty of other methods they could've used to bring a device to market.  But, the combined whole was a successful product while the industry is littered with other players that nobody remembers.  (No one posts weekly Riocasts, for instance.)

I have a 3rd gen iPod in a drawer somewhere, and it sucks. The interface is clunky, iTunes is a complete pain in the arse. The LCD is awful, I could hardly believe such an expensive high end product came with a display that makes the cheapest Nokia look good. It only allows data transfer over Firewire, and the sound quality is extremely average. It only became half decent when I installed Rockbox on it.

Maybe.  I owned the iPod Video, which was a great little player.  The LCD by today's standards is nothing to write home about, but for the time, something that could play video in the palm of your hand was pretty exciting.  (Although I'm sure you have five other examples of devices that did the same thing.  Nonetheless, it was quite neat, and I used it a lot.)  The audio quality was as good as any portable CD player I had ever owned.

In all honesty, I'm not a huge fan of iTunes.  It was an OK music librarian, but it has been shoe-horned into a multi-purpose product that has outgrown its usefulness, IMO.  I didn't mind using it to sync my iPod back in the day, but for my iPhone, I touch it as little as possible.  I mostly use it as a WAV to AAC conversion tool.

Similarly with the early iPhones, my friends got them but I wasn't impressed. Yeah, there were some stupid apps where you could pretend to drink beer out of your phone or some silly game, but Android was giving us real productivity tools to do things like wifi surveys or a swipe keyboard. The iOS app store felt like one of those mid 90s shovelware CDs, full of lame shareware games and some wallpapers the guy found on the internet.

Of course early apps were primitive.  It was a new platform.  Android came out later, with lessons learned from iOS, and had the chance to get all the things Apple did wrong, right.  Some of those things were genuine improvements (like giving developers more freedom to tap the hardware), some weren't (like giving developers the freedom to hack the users' device).  Consequences for everything.

For me, one of the most exciting devices post-iPhone, was the one that had an available dock to give you peripheral I/O (including a full-size screen) where you could use your phone as a makeshift laptop.  Unfortunately, the dock was no cheaper than buying a netbook, and the capabilities of the device were less impressive than said netbook, so it failed spectacularly.  That's the mistake Apple doesn't often make.

Incidentally, I'm still waiting for a phone that will transform into a full Linux laptop with the addition of a screen, KB, and mouse.  When one comes around that is realistically usable in such a configuration... "Apple?  Never heard of them."

Fair enough, but there are laptops that weigh less, are just as thin, have better hardware and cost far, far less than an Air if you want to run Linux. It's like you want a digital watch, but instead of buying the most functional one for a reasonable price you decided to get a nice fashion accessory. Sure, it is also functional and tells time perfectly well, but you paid a lot for the logo.

You might say that.  People said the same when I used to buy Sony laptops.  In reality, I had the same reasons.  They were small, light, and had a nice trackpad (I hated using Toshibas, and my current Dell trackpad has been largely ignored in favor of the less chaotic nubby thing in the keyboard).  Nonetheless, I got a lot of "brand snob" comments from people about that too.  Whatever.  Aesthetically, I like the Air.  Functionally, I give up some performance for the size (which is a compromise I've made as long as I've been buying laptops), lose the ability to swap batteries (which I'm not crazy about), and trade the second mouse button for a well-executed multi-touch pad.  Out of all those things, it's the last that has kept me from actually buying an Air for myself.  Incidentally, I stuck one of those Apple stickers that came with all the iStuff on my Dell when my SO got her Air.  It was for a joke picture we took and posted on her FB page.  I left it because I found it amusing.  My point being, I have the logo and paid nothing extra for it.

You say that, but don't explain how. I can't really see how it is any different to MacOS. You plug the device in, it quietly finds and installs the driver for you in the background, and you get a little notification to say it's ready. If anything it's better than MacOS because you get the notification, and don't waste ages searching the internet for a solution to a problem that doesn't exist.

OK, you got your cheap shot in.  Feel better?  Seriously man, you say you're interested in knowing why I think what I do, then you turn what I say into an attack against me (or my mother.)  Should I be poking fun at you for not knowing that colored bubbles are clickable?  No, it's a reasonable criticism, if not one that I share.  When I first held a mouse, I didn't understand the semantics of the Windows desktop.  When I first used OS X, I didn't understand it wasn't a manual process to install drivers.  I shoulder the blame for those cognitive errors.  That's fine with me.

Anyway... my biggest problem with Windows driver installation is the elaborate process.  (No, it's not "too hard for me" it's just more work than it should be.)  On OS X, you can drop a KEXT into the appropriate folder.  On Linux, you can modprobe your .ko file.  Done.  Device supported.  To undo, delete the KEXT or use rmmod.

On Windows, you have the .inf files, the drivers themselves, the catalog files, Device Manager, the installation wizard (next next next next next, specify a folder, next next next next, yes I accept the unsigned driver, next next next next next finish).  To undo, you might have to resort to registry cleaning, removing files from the bowels of \Windows\System (or the equivalent depending on platform), etc.  Yes, usually it's just an uninstall EXE.  Unless that breaks, is no longer available due to a network or automated install, the vendor doesn't supply one, or things got hosed somewhere.  Then it gets ugly, fast.
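
(To put a number on how thin the Linux side of that is: a loaded driver is just a kernel module, and the kernel publishes the current list in /proc/modules.  Here's a tiny, Linux-only sketch -- purely illustrative, roughly what lsmod piped through grep does -- that checks whether a given module is loaded:)

Code:
#!/usr/bin/env python3
# Sketch: check whether a kernel module (driver) is currently loaded,
# by reading /proc/modules.  Linux-only; roughly `lsmod | grep <name>`.
import sys

def module_loaded(name):
    with open("/proc/modules") as f:
        return any(line.split()[0] == name for line in f)

if __name__ == "__main__":
    mod = sys.argv[1] if len(sys.argv) > 1 else "ftdi_sio"   # example: the FTDI USB-serial driver
    print(mod, "is", "loaded" if module_loaded(mod) else "not loaded")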

This one may be a matter of preference, but I'll give you another example that I find irritating.  I have a few FTDI and Prolific USB serial cables.  On Windows, the device name I get (COMx) depends on which USB port I plug it into.  Naming persistence.  I guess it's designed so that you always get predictable names for devices you use.  In reality, it just assures that, with two Arduinos, a bunch of serial adapters, and a couple other devices that use internal USB-to-serial adapters, I have a COM12 port despite the fact that there are never more than two, maybe three, ports available at any given time.

On Linux, I get /dev/ttyUSB0 for the first one plugged in, USB1 for the next, etc.  OS X is similar.
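
(If you want to see exactly what names your OS hands out, the third-party pyserial package will enumerate whatever ports exist -- COM3, COM12, and friends on Windows; /dev/ttyUSB0 or /dev/tty.usbserial-something on Linux and OS X.  A quick sketch, assuming pyserial is installed:)

Code:
#!/usr/bin/env python3
# Sketch: list serial ports as the OS names them.  Requires the pyserial package.
# Windows typically reports COMx names; Linux /dev/ttyUSB0; OS X /dev/tty.usbserial-XXXX.
from serial.tools import list_ports

for port in list_ports.comports():
    print(port.device, "-", port.description)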

Same for drives.  The whole "C:" thing is way overdue for extinction.  NTFS supports mounting to folders, why aren't we being pushed in that direction?  Or, why not start encouraging developers to use the device names instead?  (Although, they're arguably more obnoxious than the likes of /dev/sda1.)

I have more, but I think that's enough ranting to make my point.

So your argument is that the usability of MacOS X compares favourably to Windows 3.0, an OS released in 1990. 24 years ago. A time when some computers didn't even ship with a mouse.

No, of course not, and you know that.  Although, the window management mechanism hasn't changed much since then anyway.  Minimize, maximize, we added a dedicated close button.  That's about it.  I was using Win 3.0 when I learned the modern GUI and that's all the significance there was to that statement.

It's a synonym for "low resolution display". The latest iPhone 6 retina display barely scrapes over the 720p mark, and the previous version couldn't even claim to do that.

No, it's not.  Maybe by those who are trying to be snarky to humor themselves and others who have an issue with the brand behind the name.  But, it's generally understood now that when someone says "Retina display", they're referring to a high-DPI screen that has sufficient resolution to render curves without obvious stepping.  Whether you agree that the DPI is sufficiently high to meet that criterion or not is irrelevant.  It's the same as when someone says "Xerox machine" to refer to a copier, and "Kleenex" to refer to a facial tissue.  The industry is already using Retina to refer to non-Apple products, but usually qualified as "Retina-like" or similar to be technically correct, since Retina is (or at least could be) a trademarked term.
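
(For anyone who wants to put numbers on "sufficiently high DPI": pixel density is just the diagonal pixel count divided by the diagonal size.  Quick back-of-the-envelope sketch using the published panel specs -- nominal diagonals are rounded, so the results land a few PPI off Apple's quoted figures:)

Code:
#!/usr/bin/env python3
# Back-of-the-envelope pixel density (PPI) from published screen specs.
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print("iPhone 4   (960 x 640,   3.5\"): %.0f ppi" % ppi(960, 640, 3.5))     # ~330 (Apple quotes 326)
print("iPhone 6   (1334 x 750,  4.7\"): %.0f ppi" % ppi(1334, 750, 4.7))    # ~326
print("iPhone 6+  (1920 x 1080, 5.5\"): %.0f ppi" % ppi(1920, 1080, 5.5))   # ~401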

One of the attractive things about Windows, from a business perspective, is that if you have an old Windows 95 app your business relies on you can still run it.

Some.  Some 95 apps might still run.  However, there are plenty of examples of XP apps that don't even run in Vista, 7, or 8.  Not to speak of drivers.  It's why their XP Mode thingy exists, and one of many benefits to VirtualBox and VMware.  If backward compatibility was as good as all that, there wouldn't be 486s and Pentiums still in service now because some essential device or app runs on them, and so they can't be upgraded.  FWIW, I don't blame this on MS.  It's a supportability issue that the vendor(s) should be addressing, but sometimes don't.  That's life.

It's nice to push forwards, I agree, but look at how many ordinary users were upset when they could no longer run legacy PPC apps on MacOS. A lot of people are stuck on... is it Leopard, the last one to have PPC support, because they need to run a particular app. The price of used machines capable of running that version of the OS is now getting silly on eBay, but if you are a professional and need it for work what choice do you have?

Obviously there are casualties.  You can't move to a completely different architecture with nary a squeak.  But, the compatibility they did maintain was exemplary.  And yeah, it was Snow Leopard or thereabouts where Rosetta was last included in the OS.  For comparison, when did Microsoft stop allowing (e.g.) Itanium cross-compatibility with x86?  Right, that never existed -- although, it's not really fair either since Itanium wasn't a consumer class platform.  You might argue PPC->x86 was more akin to x86->x64, although Apple has been through that as well.

Point being, the PPC transition may not have been totally and completely transparent in every single case, but it was pretty darn close.  They provided an elegant workaround for years, but at some point the onus is on the user and developers to take the next step.

A good case study of this is Blizzard, the game software company.  They compiled WarCraft III as a Universal Binary, so it works on PPC or x86 Macs.  They have not, in all these years, released a patch for the installer, though.  It is PPC only.  So, no WC3 on Lion unless you happen to have upgraded to Lion, with the game previously installed.  Boo, hiss, bad OS X!  Except, StarCraft 1 doesn't run correctly on Windows Vista+ either.  Changes in DirectX broke the indexed-palette graphics, mostly due to desktop-effects compositing.  No patch for that either.  (Although you can hack the registry to make it work well enough to play.)

Lesson being, backward compatibility is reasonably attainable for long enough to allow users to transition to a proper fix.  Hoping for anything beyond that is at your peril.  Ask the enterprise customers who are using XP with IE 6 because their ginormous business logic web app won't run on modern IE, Chrome, or Firefox.  In that light, it might actually be a favor to jettison legacy support so you don't lull your customers into expecting it to be there in ten years.
 

Offline Richard Crowley

  • Super Contributor
  • ***
  • Posts: 4319
  • Country: us
  • KJ7YLK
Re: OT "If iPhone 6 Were Actually Better"
« Reply #31 on: October 10, 2014, 09:07:24 pm »
  Ultimately, I like their products, and I am impressed by the design and engineering.  That's really all there is to it.  I don't see how that's different than having a preference for any other brand or product.  What else is there?
Really?  What other brand can you think of where people will stand in line around the block in inclement weather for days waiting for first shot at a product nobody has tested or reviewed or even seen yet?  Sorry, but that is just sheep-like behavior.  So much so that competitors can make commercials about the folly that everybody relates to.  Maybe it is a character flaw, but I have a deep distrust of the popular and the majority.

I just read a relevant quote in another discussion here in the EEVblog forum.  To wit:

Quote
There's an old saying about poker games. If you look around the table and can't tell who the "mark" is, then you are the "mark" :)
 

Offline SirNick

  • Frequent Contributor
  • **
  • Posts: 589
Re: OT "If iPhone 6 Were Actually Better"
« Reply #32 on: October 10, 2014, 09:26:56 pm »
There's a very real difference between people who have to be first to have the latest trendy product, and people who use the same product because it works for them.  Equating these two types of people is just as ridiculous as being one of the former.

I have successfully abstained from lining up at my nearest Apple Store for the iPhone 4S, 5, 5S, and 6.  I am due for an upgrade though, so if I happen to walk into a shop some time when the wait is maybe a little less than a day, buy an iPhone 6, and heck... maybe even like the thing...  Well, that seems to me a reasonable action for a consumer to take, and not necessarily that of a sheep.  That is the only point I hope to make.

BTW, people also line up at Wal-Mart and Best Buy and any number of other retail chains to buy whatever is on sale the day after Thanksgiving.  In my climate, in late November, setting up camp at midnight for a few bucks off a new TV is not a particularly sane thing to do either.  And yet, without fail, every year...
 

Offline victor

  • Regular Contributor
  • *
  • Posts: 110
  • Country: 00
  • Boy who writes code and take things apart
    • vitim.us
Re: OT "If iPhone 6 Were Actually Better"
« Reply #33 on: October 15, 2014, 03:09:25 pm »

I would still like Forstall to come back and fix the mess that is iOS 7.  There are tales of political and personal issues, and I'm sure there's truth in all that, but after he left the rest of the crew just went ape-poo and changed the UI "because we can", without really improving much of anything.  And a lot of it is worse.  Revamps are OK.  Changing the skin, and introducing a ton of bugs and usability issues in the process, is not.

I totally agree. In some ways Apple is not following the guidelines they themselves established for developers -- they rejected apps in the past for being inconsistent or crashing. Now the native apps have cluttered UIs and CRASH. iOS is not a smooth and polished platform anymore. I remember my experience with iOS in 2009: I tried everything to make my phone lag, and I was unable to.
your body is limited, but not your mind
 

Offline SirNick

  • Frequent Contributor
  • **
  • Posts: 589
Re: OT "If iPhone 6 Were Actually Better"
« Reply #34 on: October 15, 2014, 10:12:46 pm »
Yeah.  It's a huge revamp, and so I'm trying to be patient.  I have to keep in mind that no version 1.0 is ever going to be a shining example of consistency and stability.  But, having gone from iOS 6 on an older device, that still ran pretty well, to iOS 7, which ran like mud on the same device.... I was not pleased.  On my SO's iPhone 5, it runs like buttuh though.  I haven't tried iOS 8.  That's essentially version 2.0 of the face-lift, so if it isn't significantly better, I'm going to be really disappointed in Apple.
 

Offline SirNick

  • Frequent Contributor
  • **
  • Posts: 589
Re: OT "If iPhone 6 Were Actually Better"
« Reply #35 on: October 15, 2014, 10:13:05 pm »
No. I'm saying that the reality distortion field is strong. It's slowly fading now, but you can't deny it was quite impressive when Jobs would announce something everyone else had for ages and because it was Apple doing it all the fans who previously said they didn't want it were suddenly sold on the idea.

Yep, that happens -- not just for iOS, but for any platform for which there is a loyal fan-base.  For whatever reason, products tend to spawn zealots, and Apple's marketing milks that for everything it's worth.  Even with more reasonable folk, there are biases.  That is without question, and it is indeed irrational.  I'd like to say I have none, but I do... and so does everyone else.  Human nature.  What can ya do.

I'm just saying that you don't seem to objectively evaluate features or functionality because even when there are better alternatives you prefer the Apple version but can't state exactly why beyond a vague "I just prefer it".
Perhaps I just haven't asked directly enough. What, specifically, about the way the iPhone works do you prefer?

I am not likely to provide an answer that is going to satisfy you.  Take an iPhone and an Android phone side by side.  They both make calls, they both send/rcv email, they both run apps.  There is effectively zero difference in basic functionality.  The majority of the differences are in the user experience, and mobile devices are very personal.

You clearly value your freedom to share data between apps and get software from various sources.  Great.  I do (and have) admit that I would like that too, but I find that I'm rarely encumbered by those limitations in practical use, so I'm willing to accept them.  The flipside of that is how the OS "fits" my personal tastes.  There are a ton of little things that mean practically nothing in isolation -- lots of it being aesthetic or otherwise qualitative -- that add up to make iOS preferable to me.  A lot of it is even subconscious.

Take, for example, the behavior while scrolling.  You get to the end of a list, and iOS has that little bounce effect instead of stopping sharply.  Or, when you rotate the screen, the content pivots smoothly around the center instead of falling over like a block.  Back in the earlier days of Android, these things made iOS feel a lot more polished.  It's stupid and insignificant, but it's exactly the kind of thing our minds pick up on and react to in ways you can't think your way around.  It's as elemental to usability as learning to catch a ball and understanding the physics of gravity.  It feels right.  There's a reason Apple fought to keep those stupid effects proprietary.  I have no doubt they spent a lot of time tweaking that stuff.  The human reaction to that stuff has huge consequences.
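
(To give a sense of the kind of tuning that goes into it: overscroll "rubber-banding" is basically a diminishing-returns curve, where each extra pixel you drag past the edge moves the content a little less than the last.  Here's a toy sketch of one such curve -- my own illustration with a made-up damping constant, emphatically not Apple's implementation:)

Code:
#!/usr/bin/env python3
# Toy rubber-band overscroll curve: the further you drag past the edge,
# the less the content actually moves.  Illustrative only.

def rubber_band(drag_px, view_size_px, c=0.55):
    # Diminishing-returns mapping; approaches c * view_size_px asymptotically.
    return (drag_px * view_size_px * c) / (view_size_px + c * drag_px)

for drag in (0, 50, 100, 200, 400, 800):
    print(f"drag {drag:4d}px past the edge -> content moves {rubber_band(drag, 480):6.1f}px")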

The hardware is the same way.  The device feels nice to hold.  The voice quality has always been really good.  I've heard a few other phones that I thought sounded like crap.  Same with the speakerphone.  I've used Motorola phones and BlackBerries where the keys wore out.  I've had three iPhones now (two personal, one work) and they worked perfectly until I was ready to trade up for other reasons.  The screens look great.  Say what you will about Retina, but I don't see pixels unless I look really, really, really closely.  The color and contrast are very good.  So is the camera quality.  (All for its time, of course.)  In contrast, around 2010 when I was moving from a 3GS to an iPhone 4, every Android screen I saw looked bright and sharp, but a little oversaturated and had a bit of a green tint that I didn't like.

I don't remember if I already gave this example, but I recently sat down with a coworker that just moved here to show him where some place in town was located.  I couldn't figure out how to get into Street View on the Google Maps app.  Also an iPhone user, he echoed my frustration: "Oh I know.. the old Maps app was better.  Come to think of it, I can't remember how you get to Street View either."  We tried a minute or so, then gave up.  I just pointed him to some nearby landmarks that he knew.  This isn't crucial, and I can Google it or keep tapping around to try and find it, but (at least pre-iOS 7) that sort of thing just didn't happen.  I never had to learn how to use the native apps, they just made sense.  I'll accept learning curves for things that should take some effort, but this just isn't something that should take time to figure out.

Those things all matter; add up a hundred little tiny stupid insignificant quirks, and the end result is that I like the iPhone, and I have not, to this day, held an Android device that I like as much.  You don't have to agree here.  That's why Apple exists alongside Google and Samsung and HTC and ....  You don't even have to understand or think my reasons have any validity.  Just try to get that it has nothing to do with the trendiness of the platform.  I really have no interest in following the pack.  I don't use Facebook, and the only name-brand clothes I own were popular 15 years ago.  No one is going to accuse me of being trendy -- until I mention I use an iPhone, and then it's "OMG sheep!"  ::)  It gets old.

Anyway, I'm sure the hardware landscape has changed dramatically in the last four years, but that was the last time I shopped for a phone, so aside from a few quick comparisons when an acquaintance got a new phone and offered a quick demo, that's all I have to go on.  I tell you this -- I was looking hard at the Atrix last time, but I ended up staying with Apple because the Android OS seemed a little rough and sloppy at the time, and the big selling point -- the laptop dock -- was too expensive to be a realistic alternative to just buying a netbook, which would've performed better anyway.

Now, I'm ready for a hardware upgrade again, and so I will hold one of the new enormous and pliable iPhone 6s and whatever comparable Android phones are out there.  I might be won over by the competition, especially since I'm a little unsure about the new iOS, and the new big and thin 6/6+.  But it will have to actually win me over, as I see no point in abandoning a platform I'm content with for a lateral move.

That said, Cyanogen looks interesting.  I don't know anyone running it, and since it's probably not available for demo at the local phone store, I dunno if I'll get around to giving it a fair shake this time around.  But I will definitely be keeping my eyes open.

It's not all roses though.  When I gave up my flip-phone for the 3GS, I was shocked to find that this new fancy thing couldn't move pictures and audio files back and forth via Bluetooth.  That has been resolved by other means (e.g. WiFi transfer), except that you just can't export stuff from the local media store (i.e., what you put on the phone from iTunes).  That might be an unavoidable condition of being highly visible in the media market.  I get around that by using 3rd party apps for transferring audio and video to and from the device.  NBD.

SD card storage would be nice.  Although with 64GB+, I can store my entire music library, all the apps I want, selected movies when I travel, and have enough room left over for pictures, video, email, and OS updates.  My current phone (32GB) is a bit limiting in this way, but we share an iPad with 64GB, and storage has never been a problem.  The iPhone 6 has 128GB available.  At that point, I don't think I would even use external storage if it were available.  I do not dispute that the high-capacity models are overpriced for the storage you get.  It's good to be king I guess...  I sigh, and roll my eyes, but ultimately, I can afford it and it's worth it to me.  When you think about it, $700 is a pretty cheap computer actually.

Next complaint:  I hate iTunes and the limitations on importing and exporting media.  I've gotten around that by encoding mobile versions of my movie collection (which I would do anyway -- I don't want to dedicate 20GB per film on a phone) that can sync via iTunes, or just using one of the many SMB-compatible apps to copy movies into local storage.  That content is only accessible within the one app, vs. the more open file system you'll get on Android, which is a shame... but again, it doesn't affect my usage much.  Sharing of data between apps is limited, and hopefully they'll eventually loosen up and provide a little more filesystem visibility.  Until then, I have methods that work, so...  meh.
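
(If it helps to see what I mean by "mobile versions", here's a rough sketch of the kind of batch encode I'm talking about, written up as a little Python wrapper around ffmpeg.  The paths, codecs, and bitrates are just illustrative assumptions, and it assumes ffmpeg is on your PATH -- adjust to taste.)

Code:
#!/usr/bin/env python3
# Rough sketch: make phone-sized copies of a movie folder.
# Paths and encoder settings are illustrative assumptions, not a recipe.
import subprocess
from pathlib import Path

SOURCE = Path("/media/movies")         # hypothetical full-quality library
DEST   = Path("/media/movies-mobile")  # hypothetical phone-friendly copies
DEST.mkdir(exist_ok=True)

for src in sorted(SOURCE.glob("*.mkv")):
    out = DEST / (src.stem + ".mp4")
    if out.exists():
        continue  # already encoded on a previous run
    subprocess.run([
        "ffmpeg", "-i", str(src),
        "-vf", "scale=-2:720",            # downscale to 720p, keep aspect ratio
        "-c:v", "libx264", "-crf", "23", "-preset", "medium",
        "-c:a", "aac", "-b:a", "128k",
        "-movflags", "+faststart",        # index up front for smoother playback
        str(out),
    ], check=True)

That gets you an H.264/AAC MP4 on the order of a GB or two per film instead of 20, which is the sort of file the phone-side players (and iTunes syncing) are generally happy with.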

But that isn't what the iPhone has. The laptop displays are slightly better, but the crap font rendering in iOS doesn't help them.
(Quote)  It's generally understood now that when someone says "Retina display", they're referring to a high-DPI screen that has sufficient resolution to render curves without obvious stepping. ...  Whether you agree that the DPI is sufficiently high to meet that criteria or not is irrelevant.
It was your entire point in the previous sentence.

I have no idea what you're talking about WRT font rendering quality.  Looks fantastic to me.  Concerning the second point, Retina refers to high-DPI screens whose individual pixels can't be resolved at normal viewing distances.  You're obviously free to disagree that the DPI is high enough.  That's still the point of the phrase, and I agree with it in practice.  I really have to strain to see pixels, so to my eyes, they're not resolvable from ordinary distances.  Could be you just have exceptional eyes.  Or it might just be your bias.  Whatever.  *shrug*  Higher DPI panels exist, so you're covered either way.
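
If you want to put a rough number on the "can't resolve pixels" claim, here's a back-of-envelope check.  It assumes the commonly quoted ~326 ppi figure for the phones and a ~30 cm viewing distance -- both assumptions, so plug in your own numbers:

Code:
import math

ppi = 326.0          # commonly quoted "Retina" pixel density (assumption)
distance_mm = 300.0  # rough one-handed viewing distance (assumption)

pixel_pitch_mm = 25.4 / ppi                       # ~0.078 mm per pixel
angle_arcmin = math.degrees(math.atan2(pixel_pitch_mm, distance_mm)) * 60

print(f"one pixel subtends about {angle_arcmin:.2f} arcminutes")
# ~0.9 arcmin, which sits right around the ~1 arcmin usually quoted for
# 20/20 vision -- so "not resolvable at ordinary distances" is at least
# in the right ballpark, unless your eyes are better than average.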

You mean the second gen iPod Photo? There was never any device called an "iPod Video" according to Wikipedia, but it could be wrong. I thought it was just the 2nd gen Photo though.

No.  On Wikipedia, it's labelled iPod 5th Gen.  I dunno if it was ever officially called "iPod Video", but it was colloquially known as such if nothing else.  Great little player.  It was a gift, since I had been reluctant to get an MP3 player... I hadn't found anything I liked.  I showed some interest in the iPod, so my SO bought one for me, and I l-o-v-e-d it.  Carried it everywhere, literally.  Permanent fixture in my pocket, along with a wallet and flip-phone.  That device changed my opinion of Apple.  I was a hardcore PC guy, and would've preferred something from Creative Labs (also really fond of the SoundBlaster line from years previous), but as I said before, their player was nothing compared to the iPod.

Interesting to hear about the older, inadequate analog output stage.  I dunno what to tell you.  Like I said, I got onboard pretty late, and the one I had sounded great to me.  I also watched the entire first season of BSG and Lost from that thing connected to a TV.  No complaints on quality whatsoever.  Video sure wouldn't hold up today, being SD with limited decoding capability.  But for audio playback, even with my Sennheiser and Sony over-ear headphones, it sounded good.  I sometimes did max out the volume, but it's a portable device.  I expect as much.  None of my Discmans or Walkmans (or clones thereof) ever blew my hair back in that regard either.  A few specialty players do have high-current output stages meant for low-impedance headphones, and I'm sure they sound slightly better.  But, good enough is good enough for a mobile audio device.  And the iPod sounded plenty good enough to me.

The key difference is that when Sony was making those machines no-one else offered anything comparable. Now the Air is basically okay but overpriced compared to the competition. I suppose if you must be iOS then it is your only legal option, but for Linux there are much better and cheaper machines out there. NEC's ultrabook range, for example, is just as thin but lighter, has better screens and more powerful hardware, and they are easy to maintain.

I haven't ever seen the NECs.  Saw a few Samsungs, and got a chance to use one of the larger ones.  It was OK; I remember hating the keyboard, but I don't recall why.  The later Sony stuff has a design aesthetic I don't like as much, and it got to the point where they were just as unrepairable and unupgradeable as an Apple device, so.. eh..  Using a Dell Latitude now because I got to keep it when I left my last company.  Too big, but it works.

In what way is "insert device, wait for driver to install" elaborate? .... That was true back in the Windows 98 days, but not any more. Even as long ago as XP, 12 years ago, it wasn't true.

It's not elaborate from a user point of view -- when it works.  But the mechanism for installation is overly complicated, and prone to error.  Yes, still.  MS and vendors both try to shield users from this with varying degrees of success.  I had the pleasure of being a desktop admin for a large company for a while.  Local (USB) printer drivers are the worst.  Then you see stuff like "DO NOT PLUG IN YOUR DEVICE until installation is finished!", and "This driver is not signed, continue anyway?", and so on.  I have personally had to sit down in front of a computer and download utilities from a device vendor to fix problems because the user made the mistake of plugging in the device first, and then trying to install the software on the included CD.  This is admittedly a relatively rare problem, but it's only possible because of a very poorly designed driver framework.  Part of my living has come from the fact that this is a truism.

In contrast, we had one remote site that used Macs exclusively, except for a couple of Windows PCs we sent them for compatibility with certain business apps that we mandated they use.  For some reason, they also tended to use the PCs for email -- no idea why, since Office was available for Mac.  I think it had something to do with the local network there, but whatever.  Point being, we never heard a peep about the OS X machines.  Those users were completely self-sufficient unless they needed something (like new software, or a new copier, etc.)  It could be they were all just really smart and savvy, or it could be that OS X is good at letting the user do their thing w/o getting in the way.

Since this particular subsidiary was a graphics shop, let me harp on printer drivers for a moment more here because they truly are the worst.  Our IT director tried to come up with some sense of standardization (only PCL... no, only PostScript.. OK, except when the copier has a preferred native driver... OK, just do whatever you have to do to make the thing work) -- but to no avail.  Now, MS dominated (and still dominates) the business market.  That's an area they could've exerted some force and cleaned up the mess.  We never had to touch the OS X printer drivers though.

Similarly, Apple comes along with this little thing called AirPrint.  You walk in with an iOS device, see something on the network that speaks AirPrint, pick it by name, and print.  That's it, you're done.  There's NO setup ahead of time.  Now, it hasn't taken over from PostScript exactly, but it is completely effortless on supported devices.  We had a bunch of executives carrying iPads after a while, and this made life really easy for us during annual meeting time.  We kept one printer in the conference room that spoke AirPrint, and our maintenance pretty much amounted to keeping it filled with paper and toner.

PostScript has the PPD specification to tell the host everything it needs to know about the printer to send it a job.  It's a small leap to work with manufacturers to include this in a printer's ROM and come up with a standard USB class driver to make the whole thing seamless.  Similar functionality could be achieved over a network by standardized queries.  In other words, Microsoft could have, in its position of authority, pushed for AirPrint-like functionality ten years ago.  Or, a generic interface where clients print to the Windows print server, and only the server has printer drivers to worry about.  (Now, you provide drivers at the server, but the client still has to download and install them.)
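
For what it's worth, the discovery half of AirPrint isn't magic either -- the printers advertise themselves as IPP services over Bonjour/mDNS, so any client can find them and read off their capabilities with a standard query.  Here's a rough sketch using the third-party python-zeroconf package (the listener is deliberately minimal, and the output is illustrative):

Code:
import time
from zeroconf import Zeroconf, ServiceBrowser, ServiceListener

class PrinterListener(ServiceListener):
    """Minimal handler: just print whatever shows up on the network."""
    def add_service(self, zc, type_, name):
        info = zc.get_service_info(type_, name)
        if info:
            # TXT records carry capabilities (duplex, paper sizes, URF raster, ...)
            props = {k.decode(): v for k, v in info.properties.items()}
            print(f"found printer: {name} at {info.parsed_addresses()}:{info.port}")
            print(f"  advertised properties: {props}")

    def update_service(self, zc, type_, name):
        pass

    def remove_service(self, zc, type_, name):
        print(f"printer went away: {name}")

zc = Zeroconf()
# AirPrint-capable printers register under the standard IPP service type.
browser = ServiceBrowser(zc, "_ipp._tcp.local.", PrinterListener())
try:
    time.sleep(10)  # browse for a few seconds
finally:
    zc.close()

That's the whole "see something on the network that speaks AirPrint" step -- the client reads the advertised capabilities instead of asking you to hunt down and install a driver.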

This is a very specific example, but I hope it illustrates the kind of issues I have with the Windows driver ecosystem.

On a similar note, go to Google Images and compare "OS X System Preferences" to "Windows 7 Control Panel".  Now imagine you have to walk a customer or your parents through one or the other.  One of my first tech jobs was at a DSL help desk.  Win98's IP stack was clumsy, but at least the steps to get to network card and protocol settings were simple.  When XP came out, it was a little more work, but still bearable.  After I left, I talked to some folks still working there when Vista came out.  They HATED it.

When we got Mac calls, all you had to do was look at some screen shots once.  It's a direct route, and I don't remember if I ever got a call where it didn't work.  I walked a lot of Windows users through resetting the IP stack though.  Similar with field visits.  Lots of Mac homes, but the only reason I was there was to install the service.  I had no end of PC repair calls.  The numbers are skewed, since there were more PC users than Mac users, but I knew when I got to a site where they were using a Mac, I was going to be done in five minutes.  On a PC, I just expected I would need to find and install a driver, remove a dozen IE toolbars, run Windows Update to fix some known issue or other...
 

Offline tom66

  • Super Contributor
  • ***
  • Posts: 7014
  • Country: gb
  • Electronics Hobbyist & FPGA/Embedded Systems EE
Re: OT "If iPhone 6 Were Actually Better"
« Reply #36 on: October 16, 2014, 11:11:25 am »
Really?  What other brand can you think of where people will stand in line around the block in inclement weather for days waiting for first-shot at a product nobody has tested or reviewed or even seen yet?  Sorry, but that is just sheep-like behavior.  So much so that competitors can make commercials about the folly that everybody relates to.  Maybe it is a character flaw, but I have a deep distrust of the popular and majority.

I have seen queues like that for new games consoles, new game releases, book releases (Harry Potter, Twilight Saga  :palm:, etc.), and for Samsung's Galaxy Android smartphones; it's not unique to Apple.
 

Offline SirNick

  • Frequent Contributor
  • **
  • Posts: 589
Re: OT "If iPhone 6 Were Actually Better"
« Reply #37 on: October 16, 2014, 07:50:52 pm »
Agreed. I think what it boils down to is that you are comfortable with iOS so put up with the reduced functionality and limitations. You complain about them, like not being able to move files around easily over Bluetooth or USB, but apparently the discomfort of having to adapt to Android is greater than the discomfort of having to look for hacky work-arounds.

No, you're just restating my preference in a way that makes it sound worthy of contempt.  I'm being honest about the limitations that are there.  I would prefer they were not.  But the reality is, it's just not that big of a deal to me.

Imagine for a sec that iOS 8.1 comes out and Apple decides everyone gets free access to the underlying FS.  OK, great.  How would this change my actual behavior?  Very little.  I have a couple apps that I like for various things.  I found a nice media player (basically a mobile port of VLC, IIRC) that can grab files over SMB and FTP, and might even support SCP.  It has a little file manager where you can manage the stuff you've downloaded.  Right now, it's restricted to just the content you got via that one app (well, it can also play from your iTunes library, it just can't export to it).  If that were lifted, so I could share the content with other apps, I would still use the same app for playback because it does a good job at it.  So how much am I really suffering here?

Similar with the SD card thing.  Would be nice.  But I have an SD card slot in my laptop.  I use it mostly for building images for e.g. a RaspPi or BBB.  If I'm taking movies on a trip, I just copy them to the HDD.  On our iPad with 64GB, I have enough space to keep media locally too.  My phone is currently tight on space, but the next gen of hardware I buy will not be.  So, again... how much does this impact my actual usage?

In terms of moving data via Bluetooth, the first iPhone I had was missing that functionality, and it did suck.  Now, there are copious ways to achieve the same end result -- many of which have far better usability than what I had on my old Razr.  Looking back, the multi-step process of transferring single files with Bluetooth and having to confirm each item on the receiving device.. THAT'S the hacky way.  Good riddance.

I am comfortable with iOS, that's true.  But, it's not a fear of the unknown that keeps me from changing platforms, it's the fact that I'm not left wanting as-is.  So, why change?

It's interesting that you say there are lots of little things about iOS that you prefer, but they generally don't seem to be actual features, just aesthetic choices. The fact that it takes more effort to deal with notifications on iOS doesn't seem to matter, because it bounces when you scroll to the end of a list.

sigh... Yep.  That's totally the takeaway.   :palm:

You're intentionally dismissing the gist to make the point that my preferences are irrational.  Like I said, I'm not going to convince you otherwise.  Believe what you want.  I gave you probably half a dozen examples that add up to the fact that the iPhone and its OS feel comfortable.  Given a choice between two platforms that accomplish the SAME GOALS, I choose the one that feels nicest to use.  Is it shallow?  Sure.  That just means the basics are covered sufficiently, to the degree that the tie-breaker comes down to which one has the nicest paint job.

Wikipedia has a comparison of mobile OSes.  In terms of features, Android and iOS are about neck and neck.  Each has its minor victories.  On the whole, they're both feature-complete.  If you're looking for the Killer App on iOS that will resolve our difference of opinion, it doesn't exist.  If iOS never existed, and Android was all there was, I would be perfectly happy with it.

I have no idea what you're referring to with notifications.  Maybe I would appreciate Android's handling of... whatever... better, but right now I'm in no pain.  So.. *shrug*.  I guess I'm not compelled to search for greener grass.

How was it connected? Analogue video output?

Yep.  This was in the era of tube TVs.  Mine did have component inputs, but they were still limited to 480i.  So, I used a cable that broke out the headphone jack into L+R audio and composite RCA.

Video output on iOS devices is sub-standard. The Lightning connector does not support HDMI output natively, the way MHL over Micro USB does. The Apple video adapter has an ARM processor that takes lossy, compressed 720p video from the device and converts it to HDMI, so the result is low resolution and over-compressed. It's kind of outrageous on such an expensive, supposedly well engineered device.

Interesting.  Well, you're right... it's obviously no replacement for a proper media player.  I've used the iPad on vacation before, with a 30-pin to VGA adapter to the hotel LCD TV, and it did the job.  I'm not going to sweat over compressed 720p from a phone, since I doubt I'll have content on the device that makes that the dominant bottleneck.

It comes down to real-world pain points again.  It would be swell if it were perfect, 60-fps 3D 4K HDMI, but the practical difference between that and what you actually get is virtually nil.  Sorry, I can't get worked up about this.

OTOH, if that whole phone-as-a-portable-Linux-box thing works out, I'll be sure to appreciate the legitimate HDMI interface.

How is that any different from MacOS? Surely it doesn't let you install random, unsigned code in the kernel? That would be a vast security hole. I don't know if it lets you do enterprise wide driver deployment for things like printers. If you are getting warnings about drivers not being signed for printers from major vendors you should be really worried.

If you haven't seen warnings about unsigned code from major vendors, you haven't installed much hardware.  That happens with regularity.

It seems like you had some issues with crap hardware/software on Windows, and assume those problems can't happen on Mac OS. I can't see how that follows, since clearly people can write crap software for Mac OS, sell crap hardware that is compatible with it, and there are security warnings if they don't bother to sign their driver code. I have a feeling you probably paid more for the Mac compatible hardware anyway, as the really cheap crap doesn't bother supporting it, so maybe that's why you had fewer problems.

Assumptions again.  In actuality, the last printer I bought for home had AirPrint support, which I didn't notice until after I set it up.  I wasn't involved much in the shopping this time... the model we got was Her decision.  She uses paper more than I do, and I just don't care that much.  I can tell you for sure AirPrint wasn't the feature that sold us -- that would be duplexing, scanning, and network connectivity.  Nonetheless, it makes it super convenient to (e.g.) get a PDF via email and send it directly to the printer, so I appreciate it being there.

As for the business stuff, ask your IT department about their own experience with multifunction copiers.  It's pretty much universally miserable.  At my last job, I installed Ricoh, Xerox, Canon, Toshiba, HP, Dell...  I suppose those are all crap vendors, and the one you use is obviously the way to go, so I'll concede it's probably my fault, or Apple's fault, or something like that...

Anyway, it's an example, but not really the point.  The plug-n-play experience on Mac OS is beautiful.  Given the choice between dragging a KEXT to a folder and having the device just work; or pointing a wizard to the proper .INF file, selecting my exact model from a list (why is that even a step?  Every modern bus has unique device IDs...), going through a copy process and (optionally) allowing unsigned drivers, then closing out of the wizard, the device properties window, and Device Manager...  Who really prefers the Windows way?

And when the driver already exists (like something both systems support natively), where Windows is at top form, try plugging a new device into a PC and a Mac.  Like, a mouse for instance.  On Mac, you plug it in, wiggle it, and it works.  On PC, you plug it in, "New Hardware Found", wait a sec, "Your device is installed and ready to use"... wiggle it, and it works.  At best it's equal, but it usually takes a few seconds more.  Obviously that's not a huge deal, but the overhead of all the registry changes and INF parsing is still there.  Proof of an ugly, laborious process that isn't necessary and only causes more trouble when something breaks.

It's really the kind of thing you have to experience for yourself to appreciate: it so rarely demands your time and attention that you start to notice how nice it is not having to deal with that stuff anymore.

With a Hackintosh, the experience is mixed... The OS isn't meant to run on commodity hardware, so you have to be a little more careful about the system you choose.  Windows has the advantage there, hands-down, in terms of broad compatibility.  However, if you have two motherboards with hardware that will run OS X (and it's getting easier to meet that requirement, both due to Apple's hardware choices and the work of the community that provides the core workaround drivers) you can pluck the drive from one and boot up the other and... it just works... on the first boot, like nothing had changed at all.  Same as Linux, for example.  To be fair, I haven't tried this on Win 7 yet -- I have a new mobo on the bench so I will soon -- but if the past is any indication, there will be some nursing involved, if not a complete reinstall.  We'll see.
 

Offline SirNick

  • Frequent Contributor
  • **
  • Posts: 589
Re: OT "If iPhone 6 Were Actually Better"
« Reply #38 on: October 17, 2014, 06:38:18 pm »
You can't even avoid using the language of Apple marketing. You see it copied by all sorts of people now. Things are always described as "beautiful", "surprising", "just works" etc. It's functional and reliable, not beautiful in any conventional sense of the word.

OK, I think I'm done here.  If this is going to turn into an anti-Apple rant, then there's really nothing left to debate.  I like their stuff because it works well and the UX is polished.  I value those things, and the products I use (both Apple and otherwise) tend to mirror that.

"That's beautiful" is a phrase I use to describe a lot of things I appreciate for some reason or another.  I'm pretty sure it didn't come from an Apple commercial, but hey, what do I know?  I'm just a sheep, so I only think what Apple tells me I can. :scared:

Glad you found a platform you like.  Would really be nice if you would allow others their preferences, without turning up your nose and carrying on about their obvious ignorance.
 

