New PC build p0rn photos.
NiHaoMike:
--- Quote from: paulca on April 14, 2022, 09:38:50 am ---Server != Workstation != Gaming Desktop.
They all have different requirements and workloads. The former two are typically tuned for large parallel loads, the latter for heavy single-threaded load and real-time graphical frame-rate production. A server might run a TitanX or A5000 card to run 64 parallel encoder and FX threads on the GPU rendering a 4K video: a task which can be predicted, planned, optimized and then executed. A gaming PC is handed a stream of objects, light sources and so on in real time, is expected to return each frame in under 20ms, and has to sustain that for hours at 100 frames per second.
When benchmarked, high-end servers and workstations tend to outperform gaming PCs on pure parallel compute tasks, especially when they have multiple CPUs and GPUs. Video editing, timeline scrubbing, rendering, and CAD, CAM and aero/hydro simulations all run better. But when it comes to running games, server and workstation hardware struggles and underperforms. It tends to aim for efficiency and parallelism rather than running balls-to-the-wall, caned at 100% to get those last 10 FPS of live rendering.
That's not to say you can't pimp and mod a server board to play games. But it wouldn't be cheap. (Linus Tech Tips did build a machine with 8 GPUs on a server board, ran 8 VMs, assigned each a GPU and got 8 gamers to play on it simultaneously. I believe they then showed how hard it was to cool, with ridiculous aircon units, duct fans and so on. Sure, the games were playable, but not exactly outperforming a standalone rig... and the cost was eye-watering.)
Turning a rack server into a gaming rig would be very much like taking a Ford Transit, completely refitting it with a Jaguar XJ220 engine and creating a race car out of it.
To extend that further.
Server = Truck
Workstation = Van
Desktop = Hatchback
Gaming PC = Sports car
Sure the van will carry more stuff and do the whole job faster, but it's just not as much fun.
--- End quote ---
What hardware do professional flight simulators run on? I doubt they're just putting gaming PCs into those. My guess would be rack servers with GPUs and some sort of high-speed backplane like InfiniBand.
Back when I had just graduated from high school, I was shopping for a laptop to use for college. I narrowed the choices down to two Dells: a gaming model and a workstation. The only real differences between the two were that IEEE1394 was standard on the workstation but optional on the gaming model, and that the gaming model came in a whole rainbow of colors while the workstation was "any color you like as long as it's flat black". I went with the workstation, since I used IEEE1394 as a cheap high-speed LAN back then.
--- Quote from: themadhippy on April 14, 2022, 11:59:00 am ---After many years and ££££ I gave up trying to maintain a gaming PC and just bought a games console. No more chasing the latest graphics card or trying to tweak out the last bit of performance while putting up with the noise of multiple fans; now the computer sits almost silently, doing everything it needs to. My last console cost less than the new motherboard/processor/power supply/case upgrade loop I would need to complete to play the latest games I wanted at a decent speed. An added advantage is the abundance of second-hand games available for the console.
--- End quote ---
And a level of user-hostile "security" that would make an iPad look like an open platform. For that matter, might M1 Macs become the new "console"?
Berni:
Yep, my desktop machine is pretty old.
It's an i7 4790K with 32GB of DDR3 and that GTX 1070, so an 8-year-old CPU at this point. But thanks to AMD being out of the picture and Intel doing nothing because of it, that CPU is not horribly slow by today's standards (just the 4 cores is a bit few these days). It does the job fine. So unless you really want to be on the bleeding edge, you don't actually need to upgrade a PC all that often (assuming you buy a decently fast PC to begin with).
As for gaming: I'm not that serious a gamer, it's just something I do here and there when I don't feel like doing work. I don't tend to play the latest AAA titles right out of the gate, and if they are very demanding I'm fine with dropping the graphics settings lower. My screens are only 1080p anyway, I don't have VR, etc., so that GTX 1070 is still serving me well.
I also don't really care for physical copies of games. They are more of a hassle to buy, you have to keep them somewhere and dig out the disc when you want to play. My i7 machine never even had an optical drive (the old PC's IDE DVD drive couldn't work with the new motherboard, so I just didn't put one in and never missed it). So I like Steam, where I can buy a game in seconds with a single click, download it and play it minutes later. There is no draconian DRM with it: I can install Steam on multiple PCs and install any game with a click, and once authenticated the games also work in offline mode without internet. Sure, my game collection would be gone if Steam went out of business, but that's pretty unlikely to happen any time soon.
--- Quote from: NiHaoMike on April 15, 2022, 12:13:11 am ---What hardware do professional flight simulators run on? I doubt they're just putting gaming PCs into those. My guess would be rack servers with GPUs and some sort of high-speed backplane like InfiniBand.
--- End quote ---
You can build an FAA-certified flight training simulator using X-Plane 11. It just takes special licenses for all of it, as well as certified hardware for the controls. As usual in aviation, the hard part is the certification, and being certified does not actually mean it's particularly good; it just means it was signed off by a bunch of people as being good enough.
So you basically just need a stack of "gaming PCs" to run it. But by the time you've spent the money on just the software licenses you are already into six figures, so you might as well also spend some extra on Nvidia's more enterprise-grade offerings like the Quadro GPUs and a proper server board. But it is still just a PC.
The fancier Nvidia enterprise solutions that pack >10kW of GPUs into a single server case tend to be specialized for compute, so they don't have a video output connector. Those instead get used for raytraced rendering of CGI movies, AI training, weather simulations, etc.
paulca:
I think the solution integrators like Dell, HP et al. that sell high-end hardware that isn't for gaming tend to target corporate markets. Johnny's gaming PC going offline because he remounted the radiator upside down is not a big deal. A customer having to pay £200 an hour for a technician in a faraway country to physically pull the server and repair it while it has downtime = big deal.
Corporates are willing to pay way over a premium if the system comes with on-site support and warranty. They want to know they can have Dell/HP/whoever at their datacentre the next morning with an identical replacement. So when you spend £3500 on a basic 1U rack server based on "commodity hardware", you are really only getting about £2000 worth of kit and paying the £1500 premium for that "enterprise grade" warranty and support... which a gamer, with only one machine to look after and it sitting right beside him, isn't interested in. It's just not worth paying for support when you built it yourself! Just replace the parts yourself.
I remember doing computer hardware specs for companies several times. We basically broke it down into categories: desktop/office PCs and laptops, general workstations, specialist workstations, and servers. All that really meant was that the general workstation had lots of RAM, a bigger CPU and 2 monitors (see: software engineer or electronics designer). The specialist workstations were the ones that came with titles like "CAD" or "CAM"; they tended to have one de facto application, and you just fed it the best-specced parts to meet its needs. There was no consultation. The customer just ordered X desktops, Y workstations, Z CAD or CAM machines. They trusted we specced them fine, trusted we would support them, and so on. They didn't even ask about specs; they just assumed we knew what we were doing and, to be honest, they were running 1995 hardware in 2005 (Compaq Presario servers, etc.), so it wasn't going to be hard to impress them.
The desktops ended up being MicroATX boards with onboard video and 1GB of RAM, in Windows XP days. I scooped myself a nice Presario server for my network, including a SCSI RAID array (4GB, LOL) and a 4GB tape drive... which ended up in the bin not long after.
paulca:
Server side, things scale immensely these days.
In work we are migrating from one data-warehouse solution (IBM) to another (Apache Hadoop, Cloudera stack).
I ran a query the other day, nothing special, just counting the number of rows in the primary partitions. The number of rows was impressive, but nowhere near as impressive as how long the query took. It gave me my answer in 5 minutes. What made my eyes go wide, however, was that the total CPU time for the query was 4 days: 4 days of CPU time inside 5 minutes of wall clock works out to over a thousand cores crunching in parallel, on average.
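For flavour, a query like that on a Cloudera-style stack looks roughly like this. A minimal PySpark sketch only; the warehouse.events table and the part_date partition column are invented for the example:

--- Code: ---
from pyspark.sql import SparkSession

# Attach to the cluster's Hive metastore (assumes a configured Spark client).
spark = (SparkSession.builder
         .appName("partition-row-count")
         .enableHiveSupport()
         .getOrCreate())

# Count rows per primary partition of a hypothetical warehouse table.
counts = spark.sql("""
    SELECT part_date, COUNT(*) AS row_count
    FROM warehouse.events
    GROUP BY part_date
    ORDER BY part_date
""")

counts.show()
--- End code ---

Each node scans its own local blocks of the table, which is how 4 days of CPU time collapses into 5 minutes of wall clock.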
I pulled up the cluster stats: the number of cores was over 4,000, total available memory was around 16 terabytes, and disk space was measured in petabytes. And that was just our quota/segment of the cluster and SAN.
An individual edge node, proxy or agent box... basically a standard server... has a spec featuring 128 cores and 1TB of RAM. They share a redundant pair of high-speed 100TB RAID NAS boxes for "local" storage.
Mostly they are integrated rack / blade systems. Big storage array and power supplies at the bottom, then blade after blade after blade of processor/memory units.
Behind all this are layers upon layers of virtualisation. So you could assign all those resources to one "machine", or you could carve them up, statically or dynamically, into a dozen lesser dev or application servers, for example.
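The real enterprise stack sits several layers above anything you'd touch by hand, but just to give the "carving up" idea some flavour, here's a toy sketch of statically splitting one Linux host using cgroup v2. This is a stand-in for what the virtualisation layers do, not how they actually do it; the node size, group names and limits are invented, and it needs root on a cgroup v2 system:

--- Code: ---
# Toy sketch only: statically "carving up" one Linux node with cgroup v2.
# Real clusters do this via hypervisors/YARN/Kubernetes, not by hand.
from pathlib import Path

CGROUP_ROOT = Path("/sys/fs/cgroup")

def make_slice(name: str, cpus: str, mem_bytes: int) -> None:
    """Create a cgroup pinned to the given CPUs with a hard memory cap."""
    slice_dir = CGROUP_ROOT / name
    slice_dir.mkdir(exist_ok=True)
    (slice_dir / "cpuset.cpus").write_text(cpus)        # e.g. "0-63"
    (slice_dir / "memory.max").write_text(str(mem_bytes))

# Delegate the cpuset/memory controllers to child groups
# (systemd distros usually have this enabled already).
(CGROUP_ROOT / "cgroup.subtree_control").write_text("+cpuset +memory")

# Split a hypothetical 128-core, 1TB node into two "dev servers".
make_slice("dev-a", "0-63",   512 * 2**30)
make_slice("dev-b", "64-127", 512 * 2**30)
--- End code ---

Swap the numbers and you have the static "dozen lesser dev servers" split; the dynamic version is just software rewriting those limits on the fly.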
I think they have moved a lot of the actual "host nodes" into the AWS and Azure clouds and seamlessly integrate them into the virtualisation/logical layouts.
It's pretty insane, tier 1 enterprise stuff. My view into it is quite limited; I don't even know the full extent of the production data cluster. The UAT/QA system alone is massive, and for production there are 2 exact copies of the entire cluster running hot/hot for load sharing and resilience.
tom66:
Well, I'll go against the sentiment: nice build :)
And who cares about the RGB lights - it's your PC, you can make it look however you want. Personally I chose components with no built-in lighting (the exception being the stock AMD cooler) because my PC sits out of direct view of my desk. But if it were on show, I could see myself putting some lights in there.