
W11 - Worth upgrading from W10 ?

(Page 16 of 17)

bd139:

--- Quote from: Mechatrommer on June 13, 2022, 02:14:48 pm ---Maybe you can ask the people at NASA or CERN which PC/laptop is suitable for their particle physics modeling/prediction etc.

--- End quote ---

LOL

NASA. Curiosity team...



Also, the researchers at CERN tend to use MacBooks, and the place is absolutely crawling with iMacs. The only place they don't really feature is on the compute clusters and admin stuff. But it's all Unix, and the Macs are Unix...

Same in fintech these days as well.

Edit: don't forget that Tim's old WWW box at CERN was a NeXTcube, which is basically what the current line of Macs is (Mach + NeXTstep + some evolution).

Reality is you can go to ANY high street these days and buy a multi-core, 64-bit RISC, certified Unix machine descended from the NeXTcube used at CERN, NASA etc. It ships with a full suite of compiler tools and a full Unix command line, runs rings around most other things on the market on performance, beats everything on the market for power consumption and battery life, and beats everything that has ever existed on build quality.

Mechatrommer:

--- Quote from: bd139 on June 13, 2022, 02:37:49 pm ---NASA. Curiosity team...
--- End quote ---
Not the socialized office people again! :palm: Those are not personally owned laptops, they are company assets, suited for the purpose ;D

tooki:

--- Quote from: Mechatrommer on June 13, 2022, 02:14:48 pm ---
--- Quote from: tooki on June 13, 2022, 11:21:16 am ---As for comparing a desktop PC to a Mac laptop: very few PC laptops have upgradeable graphics cards. Laptops, regardless of platform, tend to have highly customized motherboards, and while the Mac has embraced fully-integrated, completely non-customizable/upgradeable motherboards, the PC world is going the same direction, just a few years later (as always...)

--- End quote ---
Just to be clear, my earlier comments were about laptops in general, not particularly the Mac brand. A MacBook is just a high-end version of a laptop. No laptop has an upgradable graphics card; that's one of the things I was talking about.

I don't consider development tools such as Keil or Microchip Studio "demanding", and even Photoshop or Altium Designer are in the "lightweight" to "middleweight" range. AutoCAD and other 3D CAD packages or renderers start to demand GPU power, plus CPU power for custom render methods the GPU can't do efficiently. Things get heavy when you play with engineering solvers and FEA/FEM analysis tools. And I've heard people talk about large-scale software development where the compiler will use every last bit of your CPU power compiling many source files; I don't think Keil is one of those cases. Probably in sci-fi film production too. Maybe you can ask the people at NASA or CERN which PC/laptop is suitable for their particle physics modeling/prediction etc. And there is no amount of processing power on earth that is "enough" if you want to run a custom O(n^x) operation with x >= 2 on an extremely large n, even for a single small function of code. YMMV.

--- End quote ---
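To put rough numbers on the O(n^x) point in the quote above, here is a minimal, purely illustrative Python sketch (not from either post), assuming a ballpark throughput of about 1e9 simple operations per second per core:

--- Code: ---
# Illustrative only: how long a naive O(n^2) pass over n items takes
# at an assumed ~1e9 simple operations per second on a single core.
OPS_PER_SECOND = 1e9  # assumed throughput, not a measured figure

def quadratic_runtime_seconds(n: int) -> float:
    """Seconds needed to perform n * n unit operations."""
    return (n * n) / OPS_PER_SECOND

for n in (10**4, 10**6, 10**9):
    secs = quadratic_runtime_seconds(n)
    years = secs / (86400 * 365)
    print(f"n = {n:,}: ~{secs:,.0f} s (~{years:,.1f} years)")

# Output:
# n = 10,000: ~0 s (~0.0 years)
# n = 1,000,000: ~1,000 s (~0.0 years)
# n = 1,000,000,000: ~1,000,000,000 s (~31.7 years)
--- End code ---

Throwing a hundred times more cores at it only divides those numbers by a hundred, which is the sense in which no amount of hardware is "enough" once n gets large.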
Dude. :palm: Nobody is disputing the existence of users with truly high-end needs. But your original statement, made to exemplify the need for upgradability, was that anyone who chooses to "upgrade" their "skill set" will need upgradability. And that's complete and utter nonsense. The hardware demands of a piece of software have no relation to how far someone goes in their career! A programmer, no matter how skilled, will never need cutting-edge hardware to do their job. (OK, if you're going to compile the entire Windows source code, it's handy, but then it's not just your own work.) Meanwhile, anyone doing 8K video editing, no matter how low-skilled they may be (even someone straight out of school at a film studio), is going to need a beast of a machine with massive storage infrastructure. It has nothing to do with skill and everything to do with the requirements of the job. And realistically, very, VERY few things these days require high-end hardware. The vast majority of workers, regardless of skill level, have minimal-to-modest system requirements and won't EVER upgrade their computer; they'll just replace it.

I remember when computers were so slow (very early 1990s) that a standard part of benchmarking in computer magazines was recalculating large Lotus 1-2-3 or Excel spreadsheets, because the difference between an entry-level machine and a top-of-the-line one could be 45 seconds vs. 15 seconds. That's a real, palpable difference. Only a few years later, they removed the "recalculate" command altogether because it was something that could be done in real time. It used to be that photography was an extremely high-end application; now it's something that (at least when implemented with some brains) is happy on almost anything, and graphic artists no longer need to be as high up on the performance scale as they once did. When I started my career in IT, professional standard-definition video editing required machines with tons of specialized hardware and massive disk arrays (the video never even passed through the CPU; the computer was simply the host and user interface for the dedicated hardware). SD (and HD) video editing is something modern systems can do in their sleep.

I also think it's important to point out that games have basically redefined what high-end means. GPU development isn't driven by high-end professional graphics (engineering workstations, etc.) any more; it's driven by games. A GPU that's modest for modern games is, frankly, quite capable for a whole lot of engineering software. (The main difference between "pro" cards like the Quadro line and their gaming counterparts is the drivers, which are optimized for accuracy over speed, whereas the gaming drivers are optimized for speed at the expense of accuracy. As an aside, on the Mac there's never been this distinction, with the drivers all more or less corresponding to the pro drivers on Windows. That's why software that demands a pro GPU on Windows is perfectly happy with the equivalent gaming GPU on the Mac.) It used to be SGI pushing GPU capabilities forward for money-is-no-object, high-end applications. Less than a decade after $250,000-apiece SGI Onyx systems were needed to create the liquid metal in Terminator 2, SGI was moribund and Maya, the descendant of the software used to make that film, was running happily on Mac OS X and Windows NT on commodity hardware, thanks to games pushing GPU development forward at a tremendous pace.

Similarly, the growth of standard computer performance is why the high-end UNIX workstations (SGI, Sun, etc.) all died out. Mainstream hardware caught up with them, so there was no longer any justification for the enormous R&D cost of designing them.

Bicurico:
After using Windows 11 for some time, I still don't like it. Windows 10 is more user-friendly, while 11 is a pain in some respects. Also, I have not found a single nice new feature.
But the upgrade is a must, because security updates for Windows 10 will cease when it reaches end of life.
The only thing one can do is delay the upgrade until then.
It pisses me off that most of my powerful computing hardware is not eligible, due to an older CPU or the lack of TPM 2.0.

austfox:
I put together a new computer for home use and installed a fresh copy of W11. About 4 hours in, all my programs were installed, drives set up the way I like them, desktops customised, and the whole thing just works without any issue.

W11 was a free upgrade from W10... W10 was a free upgrade from W7... and I've been around long enough to remember when DOS was AUD $150, Windows 95 was around the same price, and OS/2 was well over the $200 mark (maybe half the weekly wage back then).

I realise the money to be made nowadays is in data collection and App purchases, but since W11 has been working 'out of the box' for me, I can't complain.
