Author Topic: Proofreading my computer design and building a server rack station.  (Read 50565 times)

0 Members and 1 Guest are viewing this topic.

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
To anyone just joining in, ignore this first post as it is basically irrelevant now. I would skip to around page 8-9 upwards...


I am trying to build a high-end computer system for my business. I've selected all of the components I would like to use and checked them all for compatibility, and everything seems to be in working order (except for the case, which I'll explain afterward), but I would like another set of eyes to go over it.

I'll mainly be using the computer for music/video production and animation projects. I'll be running professional programs like Adobe Premiere/Flash/After Effects, so I need a very high-end computer setup.

I would also like anyone else's opinions on what could be better alternative components, or would further the performance of what I would like to do if you see anything lacking in the components I have chosen.

Here is my computer check list.

Motherboard - MSI X99A GODLIKE GAMING CARBON
Graphics card - EVGA GeForce GTX 1080
Processor - Intel core i7 6850k
RAM - Kingston Technology 64GB RAM kit 2133MHz DIMM DRx4
SSD x1 - Intel Single Pack 400GB 750 Series Solid State Drive PCIE Full Height 3.0 20NM MLC 3.5"
HDD x4 - WD Black 2TB Performance Desktop Hard Disk Drive - 7200 RPM SATA 6 Gb/s 64MB Cache 3.5 Inch
Network Card - Intel PRO/1000 Pt Dual Port Server Adapter
Cooling fans x? - Cooler Master Hyper 212 Plus - CPU Cooler with 4 Direct Contact Heat Pipes
Power Supply - EVGA SuperNOVA 1000 G2, 80+ GOLD 1000W
Blu-ray/DVD drive - ASUS External 12X Blu-Ray Burner with USB 3.0

Everything is supposed to be included in that list (other than the case), so if anything is missing that this computer won't work without, please let me know.

As for the case: at first I was looking into the Thermaltake Tower 900 E-ATX, which I believe would house this build appropriately. However, I started looking into server racks instead, because they're much bigger than normal towers and can house additional equipment as well. I would like to house external 5.1 receivers for stereo equipment, Blu-ray/DVD drives, DACs, and preamps, possibly even additional computers (or multiples of this build, for instance) sometime in the future.

The main problem with this is that I know little about server racks and the like, so I could use some additional help choosing the right equipment.

The research I've been doing for the last four hours or so has taught me that I'll need additional pieces to assemble such a system: shelves, rails and screws, and something like this... https://www.amazon.com/Rosewill-Server-Chassis-Rackmount-Metal/dp/B0056OUTBK?th=1 ... for the actual building of the computer inside of it, right? Or could I do it some other way I'm not understanding?

I'm alright with buying just the frame/skeleton of the server rack, without the panels and the doors if necessary. I just need the actual server rack to be compatible with an E-ATX sized build, and to be compatible with the additional pieces, like the shelves and rails.

So I would really appreciate a simple walkthrough of how to build a server rack for the following purposes, along with possible suggestions for every piece that would be needed to build it:

-the above computer build.
-a 5.1 receiver for a speaker system.
-External Blu-ray/DVD drives.
-DACs and Preamps.
-Possibly future duplicates of the above computer build.

I'm looking to buy a 42U server rack as well.

I am continuing to research this myself as I wait for replies so I can better understand what I'm doing and the possible suggestions from my fellow users.

Please do not be too concerned about price; my budget is rather flexible for this project right now. I would like to keep this server rack case under $600, as I'm already looking at $4000 at the least, but if that's not possible, then it's just not possible, and I can break that budget.

Anybody who can assist me as I work on this project, I would very greatly appreciate it. I understand the computer builds and how to put one together, but the server rack thing is really confusing to me, and good information is difficult for me to find on the internet for some reason, so that's the main thing I need assistance with.

Again, thank you to everybody who is willing to help me build this workstation. I'm hoping to have it finished by the spring. ;- )
« Last Edit: January 28, 2017, 07:26:04 pm by Lizzie_Jo_Computers_11 »
 

Offline PTR_1275

  • Frequent Contributor
  • **
  • Posts: 561
  • Country: au
Re: Proofreading my computer design and building a server rack station.
« Reply #1 on: January 01, 2017, 05:45:50 am »
Chenbro is a good brand for rackmount cases. They do several (quite a lot, actually) different models to suit what you need. I have one of their 4RU cases and it's great. You will need to mount the other items in the rack separately, either by their own rackmount ears if they are rackmountable, or by fitting shelves and simply putting the amplifiers etc. on their own shelf.
 

Offline radar_macgyver

  • Frequent Contributor
  • **
  • Posts: 698
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #2 on: January 01, 2017, 05:46:36 am »
Would suggest a Xeon CPU and motherboard, and ECC memory. With 64 GB memory you're more likely to get random bit flips that could potentially lead to crashes. Supermicro makes some very good Xeon boards and barebones. Their barebones would solve the server rack compatibility issue.

I'm confused why you'd need rack mounting for a workstation.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #3 on: January 01, 2017, 12:21:49 pm »
I'm confused why you'd need rack mounting for a workstation.

The reasons are clearly stated in the first post. :-)

8TB of storage sounds like a lot, but over time it will run out, and some material will need to be archived. This is a good time to add a rack-mounted storage server to the build.
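As a rough back-of-the-envelope (just a sketch; the footage bitrate and recording hours below are illustrative assumptions, not figures from this thread), you can estimate how quickly 8TB fills up:

```python
# Rough storage-runway estimate; the footage bitrate and recording hours
# are illustrative assumptions, not figures from this thread.
def days_until_full(capacity_tb, bitrate_mbps, hours_per_day):
    """Working days until capacity_tb of disk fills up at the given bitrate."""
    bytes_total = capacity_tb * 1e12
    bytes_per_day = bitrate_mbps * 1e6 / 8 * 3600 * hours_per_day
    return bytes_total / bytes_per_day

# e.g. 8 TB of 4K footage at ~100 Mbps, captured 4 hours per day
print(round(days_until_full(8, 100, 4)), "days")  # → 44 days
```

At those (assumed) rates the four 2TB drives last only a couple of months of steady production work, which is why an expandable storage server is worth planning for now.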

If you want a really high-end computer for professional music/video editing, then I'd go for a Xeon/Quadro build. Those components are designed to work well with professional software from Adobe and other big companies, but the price jumps up drastically. If that is too big a jump, then the components you chose are just fine.

If you are doing 32-64 channel music with a lot of effects, then a CPU with more cores will perform a lot better when playing tracks in real time, and the overall experience will be much better. For final music rendering, any number of cores will be fine; it just takes longer to get the final tunes written out to the HDD.

Building everything into a rack tower is a great idea, because you get all the cabling hidden inside the rack. Just make sure it has proper ventilation, or you'll eventually cook all your devices. Stick the hottest equipment at the top and the coolest at the bottom.
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9890
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #4 on: January 01, 2017, 05:13:28 pm »
The other day, I googled for a dual Xeon build and came up with this:
http://www.techspot.com/review/1155-affordable-dual-xeon-pc/

32 threads seems like a lot.  Note the performance at various tasks and see if it will be adequate.  FWIW, those Xeon E5-2670 are fairly cheap at this point:
https://www.amazon.com/Intel-E5-2670-2-60Ghz-8-Core-Processor/dp/B007H29FRS/ref=pd_sbs_147_t_0?_encoding=UTF8&psc=1&refRID=0PR50MD6EQDK0GEZC29H

It seems that a lot of these chips are 'pulls' from servers.  Brand new chips are still pretty pricey.

It's fun to note that Win 10 will support 256 cores, presumably 512 threads.

 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #5 on: January 01, 2017, 08:32:59 pm »
Yes, 32 threads sounds like a lot, but when you have hundreds of things that have to be calculated quickly and in sync, 32 threads on a CPU is not that much anymore. GPUs use thousands of threads/cores.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #6 on: January 02, 2017, 01:36:33 am »
slicendice: Thank you very much. Those are some great suggestions. A rack-mounted storage server was something I had just run into during my continued research, so I think you're right, I should start looking into that. Any good suggestions? I'll bring my own in a little while.

When I google Xeon processors, I get 8-core prices at around $100-$200, which is an extremely big price drop. Is it lacking greatly in something that the i7 6850K has, or am I missing something here?

About the mounting rack, I had an idea: I might try to build one myself rather than spend the $1000+ on buying one. I could customize its exact size and specifications and save at least $500 doing so.

I was thinking about something else, and if it sounds like a really bad idea, please warn me of the negative consequences. If I build the whole rack myself, I could build the PC system directly into it, sectioning off about a third of the station as a makeshift computer case. It would have a massive amount of space inside, allowing adequate breathing room for any heat it gives off.

Speaking of the cooling system, I was wondering how good an idea this is; I'm not sure it's even logical right now. I have absolutely no problem with fan noise, as an even louder AC unit would likely be constantly drowning out that sound anyway, so I was looking into buying a large box fan to install in my customized server rack, something by Lasko or the like, mounting/screwing it in as one big fan. Would it be better blowing outward from the makeshift PC case section, or directly into the case, pushing cool air onto the components? Or is this somehow destructive? I could also set up two fans, one front and one back, if necessary.

rstofer: Thank you for that article; it was informative, and quite helpful in breaking down what I'm getting into with a Xeon CPU. I just can't understand why the prices are so low. If they would work even better for what I'm doing, I'd figure they'd be much more expensive than my pre-selected i7?


Something else I completely forgot to mention, which was the reason for that motherboard: the computer will also be used for video game creation (more than actual playing), possibly using high-end programs such as Unreal Engine 4/Unity, and programs requiring lots of rendering power, like Blender.

Will the Xeon build still be sufficient for that as well?

Besides that, I could build a separate computer for the game creation if it makes building this into one single PC difficult, but I'd much prefer it all as one high-performance machine. If building it as one would greatly diminish its capability, I would move towards two, but I thought I would mention this.

I'll be back on later, I have to go look up more Xeon processors/motherboards, and start looking into the memory thing as well.

Thank you so much everyone! ;- )


EDIT:

I've continued looking into Xeon processors, and have seen very, very cheap ones, to $2000 ones appearing in results.

I was reading that article provided above, and started looking at the possibility of getting this one: https://www.amazon.com/Intel-E5-2670-2-60Ghz-8-Core-Processor/dp/B007H29FRS

It's very cheap, so that is what worries me about it. I thought (in the case I continue to use the above MSI board) that I would buy two of them to install in the motherboard, which would bring the core count up to 16, which is significantly better than what I was doing yesterday. At least I think it is. I can't see anything it is lacking as opposed to the i7? Only the GHz, it would seem.

I have also been looking at more RAM. I was indeed looking into ECC, although I don't know, and haven't found a conclusive answer on whether it is compatible with the MSI motherboard. I found a few sets of 128GB RAM. What do you all think, would any of these work out better?

http://www.nextwarehouse.com/item/?2260912_g10e

https://www.serversupply.com/products/part_search/pid_query.asp?pid=279079

The latter ones are refurbished, however.

Also, I was looking at the server racks for the HDDs, and they are quite expensive, even for as few as 16 bays; at least $400 seems to be the average.


EDIT 2: LGA2011-v3 is the socket for the MSI motherboard; is that other Xeon CPU I listed going to fit into it? Its socket is FCLGA2011?
« Last Edit: January 02, 2017, 02:06:19 am by Lizzie_Jo_Computers_11 »
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #7 on: January 02, 2017, 02:07:27 am »
I was reading that article provided above, and started looking at the possibility of getting this one: https://www.amazon.com/Intel-E5-2670-2-60Ghz-8-Core-Processor/dp/B007H29FRS

This is an old, Sandy Bridge based CPU.

The Xeon equivalent to your suggested i7 6850K is an E5-1650 v4. They're only very slightly more expensive, but you will need an ECC capable board for it to be worthwhile (and ECC RAM).

On that note, you may want to consider a more appropriate board either way - a name like 'GODLIKE GAMING CARBON' does not inspire confidence. Thankfully, proper boards aren't nearly as expensive as that... thing, and you won't need the extra network cards either. Look at, say, the Supermicro X10SRA. There are also options with SAS controllers should you desire some real storage performance onboard.


Be wary of heatsink height: the Hyper 212 will most likely be a very tight fit in 4U (the heatsink is 159mm; 4U is 177.8mm max - subtract ~2mm each for the top and bottom plates, 4-6mm standoffs, and 1.5-2mm PCB and you're down to ~166mm - how tall are LGA2011 sockets with CPUs, again?).
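The clearance arithmetic above, written out step by step (the per-item allowances are the worst-case estimates from the post, so treat this as a sketch rather than a guarantee):

```python
# 4U internal-clearance estimate, using the worst-case allowances quoted above.
RACK_UNIT_MM = 44.45          # 1U = 1.75 inches = 44.45 mm

chassis = 4 * RACK_UNIT_MM    # 177.8 mm external height for 4U
plates = 2 * 2.0              # ~2 mm each for the top and bottom plates
standoffs = 6.0               # 4-6 mm motherboard standoffs (worst case)
pcb = 2.0                     # 1.5-2 mm PCB thickness (worst case)

clearance = chassis - plates - standoffs - pcb
print(f"usable height above the board: ~{clearance:.0f} mm")  # ~166 mm

hyper_212_height = 159.0      # mm, per Cooler Master's published spec
margin = clearance - hyper_212_height
print(f"margin left for socket + CPU: ~{margin:.0f} mm")      # ~7 mm
```

That ~7mm margin is what the socket and CPU package have to fit into, which is why the fit is marginal.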


Quote
About the mounting rack, I had an idea, I might try and build one myself rather than spend the $1000+ on buying one. I could customize it's exact size and specifications and save at least $500 doing so.

What's your location? You can regularly walk off with them for at most a couple hundred bucks, usually with odds and ends (shelves, power strips, etc.) attached, if you can collect. If not, well, <$200 gets you an aluminium flat-pack...
 

Offline cdev

  • Super Contributor
  • ***
  • !
  • Posts: 7350
  • Country: 00
Re: Proofreading my computer design and building a server rack station.
« Reply #8 on: January 02, 2017, 02:09:17 am »
Did you figure out the power consumption/cost? Some CPUs use less power per unit of computing than others.

That energy gets turned into heat. Computers may reduce your heating cost in winter.
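A minimal sketch of the running-cost arithmetic being suggested here (the wattage, duty cycle, and electricity tariff are placeholder assumptions, not figures from this thread):

```python
# Annual electricity-cost estimate for a workstation; all numbers are
# illustrative placeholders, not measurements from this build.
def annual_cost_usd(avg_watts, hours_per_day, usd_per_kwh=0.13):
    """Yearly electricity cost at a given average draw and daily runtime."""
    kwh_per_year = avg_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

# e.g. a dual-Xeon box averaging 400 W, run 8 hours per day
print(f"${annual_cost_usd(400, 8):.0f} per year")  # → $152 per year
```

Comparing two candidate CPUs this way (average watts under your real workload) shows whether a lower-TDP part pays for itself over the machine's life.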

Also, the IKEA "Lack" table, ODDA night table, and OPPLI media stand are sized so they can hold rack-mount equipment.
« Last Edit: January 02, 2017, 02:17:08 am by cdev »
"What the large print giveth, the small print taketh away."
 

Offline GreyWoolfe

  • Supporter
  • ****
  • Posts: 3651
  • Country: us
  • NW0LF
Re: Proofreading my computer design and building a server rack station.
« Reply #9 on: January 02, 2017, 02:31:16 am »
No real need to pay $1000 for a server rack. I scoured eBay and Craigslist a few years ago for my ham radio club; we needed one fairly quickly for one of our repeater sites, and I scored a 42U rack with all panels and doors installed, and even the keys to lock it. We paid $300 because we didn't want any of the power strips, as we were installing a UPS that had all the outlets we needed.
"Heaven has been described as the place that once you get there all the dogs you ever loved run up to greet you."
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #10 on: January 02, 2017, 02:36:31 am »
Thank you for your reply Monkeh. I was starting to think the same thing about the Motherboard, actually.

What about this board? https://www.amazon.com/Memory-2011-3-Motherboard-Z10PE-D8-WS/dp/B00O1AXIHM

It's a dual CPU, which might be capable of hosting the two Xeons, right?

The old Xeon I posted, is it not a good deal, even with two of them?

I'm going to go look into the board you suggested as well, thank you.

I'm in the United States, and all of the racks I'm looking at are far and above the $300 mark. I think I'm going to make one myself, but can anyone direct me to where I may find these cheaper ones?
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #11 on: January 02, 2017, 03:57:12 am »
Thank you for your reply Monkeh. I was starting to think the same thing about the Motherboard, actually.

What about this board? https://www.amazon.com/Memory-2011-3-Motherboard-Z10PE-D8-WS/dp/B00O1AXIHM

It's a dual CPU, which might be capable of hosting the two Xeons, right?

Sure, that'll take two current generation Xeons. But again, you can save a couple hundred bucks getting something not made for gamers.

Quote
The old Xeon I posted, is it not a good deal, even with two of them?

If you're looking to spend this much on a machine, don't even think of getting three generation old parts.

Quote
I'm in the United States, and all of the racks I'm looking at are far and above the $300 mark. I think I'm going to make one myself, but can anyone direct me to where I may find these cheaper ones?

I know you're in the US - where, exactly? ~50 mile area covers it.. You can easily find second hand racks on eBay or a multitude of other sources, if you happen to live near somewhere which would have them.

You can get the alu flatpack examples trivially: http://www.ebay.com/itm/42U-4-Post-Open-Frame-Server-Data-Rack-19-Adjustable-Depth-25-37-/152319524887

These absolutely require stiffening and shelves or rails for heavier gear, but those are cheap.
« Last Edit: January 02, 2017, 03:59:16 am by Monkeh »
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #12 on: January 02, 2017, 04:15:32 am »
Monkeh: Thank you again! Alright, so definitely not those CPUs. I think I'll go for two of the ones you posted, the E5-1650 v4? Unless I can find a better one. Are there any even better ones for around $1000 for two that can do what the ones you posted can and then some? If I know I'm going to spend real money on CPUs, I may as well get something that is definitely going to be worth it.

Besides that, do you have any suggestions for a motherboard besides the one I mentioned? I didn't think that one was made for gamers, though. It said it was a dual-CPU board, which, generally speaking, isn't something that's going to improve gaming, really.

I'm going to keep looking myself, and then come back, but some other people's suggestions are always welcome, in fact, I could obviously use some assistance.

I'm in Pennsylvania. I'm going to start looking again at what I can find, though.
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #13 on: January 02, 2017, 04:41:48 am »
Monkeh: Thank you again! Alright, so definitely not those CPUs. I think I'll go for two of the ones you posted, the E5-1650 v4?

Those are single-socket CPUs only, you need the 2643 for dual socket.

Quote
Unless I can find a better one. Are there any even better ones for around $1000 for two that can do what the ones you posted can and then some? If I know I'm going to spend real money on CPUs, I may as well get something that is definitely going to be worth it.

You need to seriously consider just how much power you think you need. You seem to be intent on spending enough money to buy a car, here.

Quote
I'm in Pennsylvania. I'm going to start looking again at what I can find, though.

PA is 280 miles across and 160 tall, that doesn't narrow things down much.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #14 on: January 02, 2017, 04:54:15 am »
Well, this is not a "hobby thing"; this is for my professional business, which requires running top-of-the-line programs such as Adobe Premiere/After Effects and being able to render complicated 4K video projects at high speed, meaning I would like it to take less than all day to finish a project.

Besides that, I need it to be able to process 64-channel VSTs in professional-grade DAWs like Pro Tools, and again be able to export these projects efficiently. Oftentimes I'm working on multiple projects at once, and I need this PC build to handle the things I'll be doing not just at an "it can do it" level, but at the level where it can do them with minimal fuss and at the quickest possible speeds.

Therefore, I need a lot of power with this computer.

As far as the rack is concerned, I've decided to build a highly customized one myself, building the PC directly inside of it instead of worrying about the different parts and pieces and where to find them. I'm fully confident in my actual building and designing ability, so it just makes a lot more sense, and it can be exactly the way I want it to be.

Anyone's input on my Lasko fan ideas from earlier?

EDIT: The 2643 seems to be a 4-core processor (according to Amazon). I was aiming for dual 8-cores, bringing it up to 16 cores.

EDIT 2: So will two of the Intel Xeon E5-1650 work with this motherboard, or is that what you were saying? That it wouldn't work putting two of the 8-cores in this dual-CPU board?

EDIT 3: Never mind, I already found the answer: it won't work. The search led me back to the Intel Xeon E5-2670. Could someone explain to me (in detail) the cons of choosing this outdated processor, in terms of how it would affect the performance of what I'm trying to do? That would be really, really appreciated.

EDIT 4: Would this... http://www.nextwarehouse.com/item/?2260912_g10e ... be compatible with this... https://www.amazon.com/Memory-2011-3-Motherboard-Z10PE-D8-WS/dp/B00O1AXIHM ...?

From what I've researched, I think it should; I've found no issues with Kingston and Asus, but again, I'm proofreading my work. Thank you all.
« Last Edit: January 02, 2017, 06:23:30 am by Lizzie_Jo_Computers_11 »
 

Offline TiN

  • Super Contributor
  • ***
  • Posts: 4543
  • Country: ua
    • xDevs.com
Re: Proofreading my computer design and building a server rack station.
« Reply #15 on: January 02, 2017, 06:29:16 am »
You need to make sure your software works with that many cores. That's often not the case; you can have 100 cores in the system, but the workload lands on only 4 of them.  ;)
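This is essentially Amdahl's law: if only a fraction of the workload parallelizes, extra cores stop helping very quickly. A quick illustration (the 80% parallel fraction is an arbitrary assumption for the example):

```python
# Amdahl's law: overall speedup from n cores when only a fraction p of the
# work can run in parallel (the rest stays serial).
def speedup(p, n):
    return 1 / ((1 - p) + p / n)

p = 0.80  # assume 80% of the workload parallelizes
for n in (4, 16, 32, 100):
    print(f"{n:3d} cores -> {speedup(p, n):.2f}x")
# the ceiling is 1/(1-p) = 5x here, no matter how many cores you add
```

So going from 16 to 32 cores barely moves the needle unless the software is almost perfectly parallel.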
YouTube | Metrology IRC Chat room | Let's share T&M documentation? Upload! No upload limits for firmwares, photos, files.
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #16 on: January 02, 2017, 08:07:23 am »
EDIT: The 2643 seems to be a 4-core processor (according to Amazon). I was aiming for dual 8-cores, bringing it up to 16 cores.

The E5-2643 v4 is a 6-core processor just like the i7 you initially specified. If you want 8-core, that would be an E5-2667 v4.

Sandy Bridge era processors are not compatible with modern boards. They are five years old - put them out of your mind.

And no, you cannot use that RAM with that motherboard - dual socket motherboards must use registered RAM.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #17 on: January 02, 2017, 08:17:38 am »
What music software will you be using? A dual Xeon may be overkill, but at least you can add a lot of RAM, which is important when using a lot of channels and effects, and you get the advantage of ECC memory, which helps prevent data corruption, also important when composing music.

Adobe Premiere works best with a CPU that has about 10 cores, but if you use GPU acceleration the speed increases drastically for 4K video, and you can scale up by adding a second GPU if you ever need even more acceleration. :-) Having 30 cores will not help at all in Premiere.

For Unity game development you can never have too many cores; if the game is coded right, you can utilize all of them. But more important is the GPU, as CPUs are terrible at graphics calculations. I would also suggest you look into Unreal Engine: it's fast and it's free (it used to be $100k-$250k for a single license; I don't remember the exact figure, but it was in the 100Ks), including source code, and you only pay once you make a lot of money from your game. It uses C/C++, while Unity is best coded using C#.
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #18 on: January 02, 2017, 08:42:39 am »
FWIW I use a dual E5-2670v1 for video editing. I bought the whole thing in a 1U Supermicro rack chassis including 96GB ECC DDR3 RAM for £650 off eBay. This particular unit has one PCIe slot, which, together with its 650W PSU, limits the graphics card you can use with it, assuming you are going to be using it as a workstation. Cinebench score is a tad over 2000. It will almost render 4k videos in real time.

I use an AMD FirePro W4100 which will run up to three monitors, all at 4k 60Hz, and is single slot and relatively low power, so no additional PCIe power to contend with. It's useless for GPU rendering but why would you use GPU rendering when you have 16 cores/32 threads?

I upgraded the chassis from 96GB to 192GB ECC DDR3 for $150, not because I needed to, but because you can!

Another upgrade (which failed) was to upgrade the processors to E5-2696v2 which are 12 cores each, at $250 each. The BIOS in my mobo didn't recognise these processors despite it being advertised by Supermicro as supporting v1 and v2 E5-2600 series Xeons.

If you are using them as workstations in a rack, be aware that they're really noisy beasts. In addition, if you're running video from them you'll be lucky to get a 5m run to work at 4k 60Hz: I've achieved this by converting from DP 1.2 to HDMI 2.0 at the card and running Amazon Basics 5m HDMI cables to the HDMI 2.0 monitors. There doesn't seem to be much of a market for long DP cables.

My experience with Broadwell-E such as the 6800K is that they don't overclock at all well compared to Haswell-E such as the equivalent 5820K. I get significantly better performance overclocking the 5820K, but overclocking might not be your bag. Otherwise the 6800K is only marginally better than a 5820K at stock speeds, perhaps 5% on a really good day. Whether you need the 6850K over a 6800K is up to you; I didn't consider it worth the price for a few more PCIe lanes and a minor speed increment.

Edit: I am aware that many software products don't scale over many cores, but I would add that if you do much effects processing, many of these don't even touch the GPU. I would also add that it's unlikely you're going to leave your multicore beast sitting there just doing a render; you can of course continue to use it for plenty of other things with all that grunt. However, one thing I would say is that if you use h.265/HEVC rendering, Skylake and Broadwell-E support 8-bit natively, but you will need Kaby Lake for 10-bit HDR. This is where a modern GPU like the NVidia 1000 series will help, as they do support 10-bit HEVC, assuming your software takes advantage of it.

A dual-socket E5-2670v1 can play 4K HEVC despite not having hardware acceleration, but it is working pretty hard, perhaps 30-40% CPU. As a comparison, a Skylake or Broadwell-E system will be running at barely 2% CPU playing the same media, as it takes advantage of the hardware acceleration.
« Last Edit: January 02, 2017, 10:03:38 am by Howardlong »
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #19 on: January 02, 2017, 12:01:14 pm »
It's useless for GPU rendering but why would you use GPU rendering when you have 16 cores/32 threads?

Because in some cases GPU rendering can be up to 1000x faster (or even more) than any CPU can achieve. GPUs are designed to process graphics data; CPUs are designed for general-purpose computation. Where CPUs struggle, GPUs usually excel, and vice versa.
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #20 on: January 02, 2017, 12:55:04 pm »
It's useless for GPU rendering but why would you use GPU rendering when you have 16 cores/32 threads?

Because in some cases GPU rendering can be up to 1000x faster (or even more) than any CPU can achieve. GPUs are designed to process graphics data; CPUs are designed for general-purpose computation. Where CPUs struggle, GPUs usually excel, and vice versa.

Citation on 1000x please, plus please also see my edit.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #21 on: January 02, 2017, 01:45:37 pm »
Yes, it all really depends on the software. Some encoders are heavily optimized for multicore CPU and GPU-accelerated rendering, and you can see the results in speed, required passes, size, and final quality. Adobe software sucks at video encoding. One would think such expensive software would contain proprietary algorithms that outperform the open-source encoders in every aspect, but this is not the case. :-)

Either way we are way off topic now.

Once you define in detail what the computer will be used for and exactly what software will run on it, it will be much easier to determine what hardware is required.

If it's Adobe software only, then 10 CPU cores is enough, 32-64GB of RAM is enough, and some mid-range gaming GPU will help in most cases, at least a bit.

ECC memory is always a good thing when producing video and audio at professional level.

If any advanced 3D rendering software or CAD software with a real-time 3D view will be used, then a Radeon FirePro or nVidia Quadro GPU is a must. They ensure compatibility, have a lot more VRAM, are generally faster in such software than gaming GPUs due to optimized drivers, and most high-end pro-grade GPUs have ECC memory, which as I already stated is quite important. Price could be an issue, though.

The motherboard should have a 3-5 year warranty, meaning the components used (especially the caps and voltage regulators) are top grade. Many Asus MBs have this, and I highly recommend Asus MBs from the professional line. They don't cost that much either, except for the best-of-the-best MBs with almost every feature a MB can have.

If you're producing music with software that supports hardware-accelerated audio processing, then getting a compatible soundcard/HW encoder/HW decoder that supports on-the-fly post-processing effects is better than having tons of CPU cores. The cards are way faster, and the CPU pretty much sleeps while you listen to multitrack audio in real time with a lot of effects and filters applied. Rendering all the effects on the CPU alone very quickly ends up in distorted sound or other artifacts.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #22 on: January 03, 2017, 04:28:52 am »
Monkeh: Thank you very much again. I'll be checking out the E5-2667 v4, while also looking into a 10-core as slicendice was suggesting. I understand where I was confused about what you said earlier, though.

Alright, those processors are completely out of my mind and off the table, then. Thank you for clarifying.

I even checked a compatibility list, I must have missed something. Thank you again, I appreciate it.

Howardlong: Thank you so very much for that detailed and well thought out reply. You obviously put time into your posts.

Quote
If you are using them as workstations in a rack, be aware that they're really noisy beasts. In addition, if you're running video from them you'll be lucky to get a 5m run to work at 4k 60Hz: I've achieved this by converting from DP 1.2 to HDMI 2.0 at the card and running Amazon Basics 5m HDMI cables to the HDMI 2.0 monitors. There doesn't seem to be much of a market for long DP cables.

Thank you for your input on this. Finding a decent way to display in 4K (or even decent 1080p) has been a complicated situation, since my setup needs longer cable runs. Noise is definitely not a problem, though. The room where this will all be operated has a huge AC running most of the time, which will drown out the loudest of fans. I'm also thinking of buying large box fans and hooking them directly into my custom-built rack for cooling, so they'll be producing noise anyway.

Quote
I am aware that many software products don't scale over many cores, but I would add that if you do much effects processing, many of these don't even touch the GPU. I would also add that it's unlikely you're going to leave your multicore beast sitting there just doing a render; you can of course continue to use it for plenty of other things with all that grunt. However, I would say that if you use h.265/HEVC rendering, Skylake and Broadwell-E support 8 bit natively, but you will need Kaby Lake for 10 bit HDR. This is where a modern GPU like the NVidia 1000 series will help, as they do support 10 bit HEVC, assuming your software takes advantage of it.

Good to know with the cores, I was quite hoping they would allow me to do more than just sit there waiting for a render to finish.

Would this EVGA GeForce GTX 1080 GPU be sufficient? Based on the kind of model you listed? Or could I do better than that?

slicendice: Thank you very much again. You've been rather helpful this whole time, and your posts have been very clear and detailed. Thank you for taking the time to write this all out for me; I appreciate it a lot.

Quote
If it's Adobe software only, then 10 CPU cores are enough, 32-64GB of RAM is enough, and a mid-range gaming GPU will help at least a bit in most cases.

ECC memory is always a good thing when producing video and audio at a professional level.

If any advanced 3D rendering software or CAD software with a mandatory real-time 3D view will be used, then a Radeon FirePro or nVidia Quadro GPU is a must. They ensure compatibility, have a lot more VRAM, are generally faster in such software than gaming GPUs thanks to optimized drivers, and most high-end pro-grade GPUs have ECC memory, which I already said is quite important. Price could be an issue, though.

The motherboard should have a 3-5 year warranty, which means the components used (especially the caps and voltage regulators) are top grade. Many Asus boards have this, and I highly recommend Asus boards from their professional line. They don't cost that much either, except for the best-of-the-best boards with almost every feature a motherboard can have.

If producing music with software that supports hardware-accelerated audio processing, then getting a compatible sound card/HW encoder/decoder that supports on-the-fly post-processing effects is better than having tons of CPU cores. The cards are far faster, and the CPU can sit nearly idle while you listen to multitrack audio in real time with lots of effects and filters applied. Rendering all the effects on the CPU alone quickly ends in distorted sound or other artifacts.

Everything you said in that last post was very helpful for breaking down exactly what I might be looking at here. I'll be getting to more on that in a moment, but figuring out the performance difference between CPU/GPU/Sound card is invaluable for this machine to be built properly.

Quote
Either way we are way off topic now.

No need to worry about that, it was actually informative for me, and quite good for my note taking, so not too off-topic.

Quote
Once we've defined in detail what the computer will be used for and exactly which software will run on it, it will be much easier to determine what hardware is required.

I probably should've done that a lot sooner, shouldn't I have?

I believe I've compiled a complete list here. If I notice something missing, I'll add it on to this list.

Everything is pretty much evened out as far as priority goes, so balancing this out is a good idea, but I'd rather push the envelope than compromise between pieces. This is meant to be a long-term computer investment that will be used day in and day out for professional-grade production, so I'm comfortable with spending in the range of $5000-$7000 if absolutely necessary, but let's definitely try to figure out the best choices overall. I don't want to under- or overspend for no reason.

In the Adobe department, I'll be heavily reliant on...

Adobe Premiere Pro CS6 (4K/2K editing, very complicated projects, with a lot of effects, etc.)
Adobe After Effects CS6 (I create a lot of logos, such as YouTube intros for clients with a lot of effects and things piling up in a project)
Adobe Flash Pro CS6 (Animation projects, a lot of cardboard-style animation; it also stands in for my other program, RETAS, a Japanese frame-by-frame program meant for anime projects, when I'm doing more Japanese-style animation, which gets demanding with lots of high-quality images at 24 fps.)
Adobe Illustrator CS6 (Used regularly in conjunction with After Effects/Flash/RETAS, as most of the preliminary layout work is done in here.)
Adobe Photoshop CS6 (Besides the usual photo editing, it fills in wherever Illustrator falls short, so it's used less than Illustrator, but still often.)
Adobe InDesign CS6 (Laying out book designs quite often; the usual things you do with these programs.)
Adobe Dreamweaver CS6 (I actually build websites quite frequently. I don't host many of them from my own computer, but I will likely have at least one massive website running from here.)

Very expensive programs... That's roughly in order of which will be used most often.

In the music production department...

Reaper 5 is my primary DAW where I do basically all of my music editing.

Omnisphere 2 is one of my most frequently used synthesizers; it uses 16 channels (generally 8 ), and it's the biggest source of the slow playback I'm fighting in music production, especially in real time.

I'm also using Komplete's Kontakt 5, and other high end instrument VSTs.

There is a good amount of effects processing done throughout here.

On my current system it simply isn't possible to run even two of these programs together, and any one of them alone maxes out the entire CPU. Basically, I have all of the programs and nothing to run them on. At first I figured I could make do while I saved enough money, but the money put into the programs is wasted if I don't have a powerful computer to run them.

I believe I've covered everything. If something else comes to mind, I'll add it to this post, or to my next post if it's nothing too important. (Internet browsing does need to be very high speed, but that's down to my router/modem combo.)

Thank you again, everyone; you've been really helpful. I've been jumping between research, a job, all of these projects, and reading up on your posts and computers in general, on top of my usual around-the-house routines, so your help with all of this is extremely time saving. A real life saver, everyone! ;- )

EDIT: I forgot Blender, and yes, real-time 3D modeling capability is definitely a must. As far as the NVidia Quadros go, I'm seeing prices from $800-$4000.
« Last Edit: January 03, 2017, 05:15:33 am by Lizzie_Jo_Computers_11 »
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #23 on: January 03, 2017, 08:08:45 am »
For the GPU and CPU part I think you have it covered. :-)

Now, for the audio part:

I looked a bit at audio interfaces and such... For external MIDI and other real-time instruments I can't be that much help, but I took a look at which sound cards Image-Line recommends for their Fruity Loops product (I've used it quite a lot; it uses MIDI, VST, etc.). They state in their FAQ that the $120 SoundBlaster Z is a really good card and recommended, and they also note that it might be a surprise that they recommend a consumer-level sound card over a professional-grade one.

Most important is that the card has a good audio processor, which most SB cards have; second, but no less important, is that the drivers support ASIO, which most SB cards also do. (ASIO lets the software communicate directly with the card, skipping the Windows audio stack completely, which improves performance and lowers latency to almost zero.)
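To give a rough sense of what the buffer sizes you pick in an ASIO driver mean in latency terms, the arithmetic is just buffer length divided by sample rate (figures here are only illustrative):

```python
# Simple arithmetic behind audio buffer sizes: one buffer of latency is
# buffer_frames / sample_rate. Smaller buffers = lower latency, more CPU load.
def buffer_latency_ms(buffer_frames, sample_rate_hz):
    """One buffer's worth of audio, expressed in milliseconds."""
    return 1000.0 * buffer_frames / sample_rate_hz

for frames in (64, 128, 256, 1024):
    print(f"{frames:4d} frames @ 48 kHz -> {buffer_latency_ms(frames, 48000):.2f} ms")
```

That's per buffer; the full round trip through the interface is a few buffers plus converter delay, which is why "almost zero" latency needs small buffers and a fast driver path.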

There is also an upgraded SB Z card (SoundBlaster ZxR), which is even better.

Hope this info helps you choose a proper card for audio processing and real-time monitoring.

If someone else has some better recommendations, feel free to correct me on this one as I have no experience with the latest Audio Software and Hardware.
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #24 on: January 03, 2017, 08:11:53 am »
I'll just briefly chime in on the audio issue before I sleep: For the love of all that is holy, unholy, unrelated to holiness, and just general existence: Do not buy a Creative card. They can't write stable drivers or firmware to save their lives and even more than that, they don't care.
« Last Edit: January 03, 2017, 08:14:12 am by Monkeh »
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #25 on: January 03, 2017, 08:14:00 am »
I'll just time in on the audio issue: For the love of all that is holy, unholy, unrelated to holiness, and just general existence: Do not buy a Creative card. They can't write stable drivers or firmware to save their lives and even more than that, they don't care.

Oh, I've never ever had any issues with SB. But a Google search could settle this difference of opinion. I'd better get started... GOOGLING
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #26 on: January 03, 2017, 08:16:52 am »
I'll just time in on the audio issue: For the love of all that is holy, unholy, unrelated to holiness, and just general existence: Do not buy a Creative card. They can't write stable drivers or firmware to save their lives and even more than that, they don't care.

Oh, I've never ever had any issues with SB. But a Google search could solve this conflict in our opinion. I'd better get started...GOOGLING

I can't speak as to professional use (other than to say a little directed googling will show up the flaws in much of their hardware and the occasional outright lies in specs, can't speak to current gen as I blacklisted them), but in their original intended application, the X-Fi series was the #1 stability issue in gaming for years. I discarded mine, so did many others - especially the people who got sued for trying to write their own drivers.

If quality audio is a goal, the #1 step is not to put it in the machine. Seriously, there are few environments on this planet noisier than the inside of a machine like this. #2 is to not have the name Creative stuck on it, IME.
« Last Edit: January 03, 2017, 08:18:27 am by Monkeh »
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #27 on: January 03, 2017, 08:47:21 am »
Yep, you guys are right about Creative and their products; a lot of issues there lately. But it's not just SB, it's also Windows, especially Windows 10. nVidia has the same problems, as does Intel on the graphics side. Even Microsoft's own hardware (the Surface product line) has issues with these devices.

About Win10... I've been a Windows Insider since MS started the program, and gosh, Win10 has issues. Even the officially released versions have issues, and it has nothing to do with bad drivers. MS is as ignorant about these things as Creative or any other big company, for that matter.

Windows 7 works great, 8.0 is crap, 8.1 is a lot better than 8.0, and 10 has issues everywhere. The first thing to do in Windows 10 is disable Fast Startup, especially on an SSD, where it is a useless feature.
Fast Startup corrupts memory over time and, eventually, the whole system. I've had to reinstall Windows several times because of this annoyance, and the only way to fix the issue is to turn it off altogether. The second thing to get rid of, if you're having problems, is the hypervisor (used for virtualization; it's enabled by default and completely useless if you're not running any Hyper-V).
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #28 on: January 03, 2017, 02:38:46 pm »
There used to be a problem with SB cards, even expensive ones, where one channel was sampled one sample ahead of the other. In most scenarios this isn't a problem, but in some technical scenarios it made a real pig's ear of things. I remember it took me weeks to figure out what was happening, as I was using the SB card to test my own device. That was, I hasten to add, about ten years ago.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #29 on: January 04, 2017, 02:23:58 am »
Interesting debate on the sound cards. I actually happen to know a decent bit about the audio world, and yes, a lot of Sound Blaster Z cards were never that helpful. Sound cards as a whole have been going a little out of style, often replaced by DACs, especially for recording live vocals and instruments. The reason is usually that a lot of DACs are advanced enough at this point that, if your onboard audio is at least half decent, a sound card often won't bring you much higher.

Of course, some people would disagree, but depending on your area, a sound card isn't always completely necessary, at least not for a huge, huge difference.

blueskull, I actually will be using professional CAD programs, in which I need real time, so that's what the fuss about a Quadro is all about.

I actually was intending to use Windows 10, as I've been using it on and off since its release, and it's been treating me well. You are VERY right, however, it has plenty of issues. I had been on 8.1 for a while (8 wasn't even funny as a joke), and have been out of the Windows 7 world for years. I'm hoping Windows 10 won't start acting up on the new hardware, or that it's at least somewhat manageable. It has some good features that have made my life plenty easier.

On to more computer parts...


I've been researching, and based on my last post, where I described exactly what I would be using, I would like some of you to chime back in, and check out my new (possibly disastrous) build.

I've overhauled everything, basically...

Motherboard: Intel  S4600LH2 Server Board

Would getting a server board be useful to me? This one has room for up to four processors, which is definitely more than necessary right now, but it would at least future-proof me. I'll list my new CPU below; like slicendice suggested, I'll be shooting for a 10-core processor. If I get compatible CPUs and actually need them later on, I could go up to 40 cores or something...

CPU Processor: Intel  Xeon E5-2630 v4 2.2 GHz Ten-Core LGA 2011 Processor

or...

Intel Xeon E5-2690 v2 Ten-Core Ivy Bridge EP Processor 3.0GHz 8.0GT/s 25MB LGA 2011 CPU

I'm not sure which is a better deal, the second one is really expensive though. They're both of the 2600 series, so I can use at least two of them together right?

GPU Graphics card: HP  Quadro M4000 Graphics Card

or...

HP Quadro M6000 Graphics Card

I'm not sure how much of a difference the second one would make over the first for my uses. By the way, yeah, Unreal Engine 4 is definitely the way to go. Ever since they made it free it's been wonderful; you don't have to be a millionaire to justify using it anymore.

That second card is really, really expensive, though... but I've been following the Quadro recommendations.

RAM memory: HP 726719-128 128GB

or...

Crucial  128GB DDR4 2400 MHz LR-DIMM Memory Kit (4 x 32GB)

I was leaning toward the second one on this one here.

I'll look into sound cards soon, if anything.

I need help with the new components, and if they're compatible, a good idea, and if there is something wrong, what I can do in order to get a compatible equivalent. Also, if the price gaps between some of this would even be worth the difference for what I need.

Thank you again. I'm going to continue researching, and I'll be back on here in a few.


 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #30 on: January 04, 2017, 02:58:41 am »
Motherboard: Intel  S4600LH2 Server Board

Would getting a server board be useful to me? This one has room for up to four processors, which is definitely more than necessary right now, but it's going to future proof me, at least. I'll list my new CPU, but like slicendice suggested, I'll be shooting for a 10 core processor. If I get compatible CPU, and I actually need it later on, I could get up to 40 cores or something...

This is a highly inappropriate board - not only is it EOL, it cannot fit a standard chassis, it has no USB 3, it has no onboard audio, nor does it have any expansion slots at all without a special daughtercard to go in its special chassis. And even then you only get two slots. So no GPU, no sound card, no USB 3, no nothing. There is no benefit in a server board, you are building a workstation.

Quote
CPU Processor: Intel  Xeon E5-2630 v4 2.2 GHz Ten-Core LGA 2011 Processor

Great. Don't try and put those in a four socket board.

Quote
Intel Xeon E5-2690 v2 Ten-Core Ivy Bridge EP Processor 3.0GHz 8.0GT/s 25MB LGA 2011 CPU

Multiple generations old - will not work in a modern board, again will not work in a quad socket board if that's the new plan (why? you can get 22-core CPUs..).

This is the sort of board you are looking for: https://www.supermicro.nl/products/motherboard/Xeon/C600/X10DAL-i.cfm

This will take any Xeon 2600 v3 or v4 CPU.

Memory for this would be along these lines: http://www.crucial.com/usa/en/ct4k32g4lfd424a

This board will also require an appropriate power supply, along the lines of: https://seasonic.com/product/prime-850-w-titanium/
« Last Edit: January 04, 2017, 03:02:36 am by Monkeh »
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16615
  • Country: us
  • DavidH
Re: Proofreading my computer design and building a server rack station.
« Reply #31 on: January 04, 2017, 03:20:54 am »
Multiple posters made the same recommendation that would have regarding ECC.  For serious work there is no good reason not to include it.

I would also consider a real hardware RAID card like an Areca for your 4 hard drives and I would probably go with HGST instead of Western Digital.  Western Digital disabled TLER on their Black drives to make them work worse with RAID several years ago.

I bought the last of the TLER supporting WD Black drives and an Areca 1210 for my last ECC system and they have worked flawlessly.
« Last Edit: January 04, 2017, 05:51:16 am by David Hess »
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #32 on: January 04, 2017, 04:33:48 am »
I understand you Monkeh.

This Motherboard seems a little bit more advanced, and is basically the same as your recommendation, right?

I'll look at the power supply, and the RAM is still good for that board, too?

I would need a Xeon 4600 series for a 4 slotted Motherboard, right?

There's no plan on a 4 slot, really, I was just wondering, in the case I needed more cores, but there's no point in it now.

https://www.supermicro.nl/products/motherboard/Xeon/C600/X10DRX.cfm

David Hess, yes, I will definitely be using ECC memory.

Also, thank you for the information on Western Digital. I will start looking into good drives from HGST.

I'm also looking for a better PCI-E SSD than what I said earlier.

What do you think about this one?

OCZ RevoDrive 3 X2

Compatible, could I do better?

Any opinions on the best Quadro graphics card?
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16615
  • Country: us
  • DavidH
Re: Proofreading my computer design and building a server rack station.
« Reply #33 on: January 04, 2017, 05:57:33 am »
I'm also looking for a better PCI-E SSD than what I said earlier.

What do you think about this one?

OCZ RevoDrive 3 X2

I would look for SSDs which include power-fail protection, which pretty much rules out anything made by OCZ and the second-tier manufacturers. Intel and, I think, Samsung make some. Unfortunately there are no good lists of which ones have it and which do not.

I would also plan on a UPS.  I prefer online types but standby units are almost always good enough.

And IPS monitors of course.  I have regretted accidentally getting TN panels in my last pair of monitors.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #34 on: January 04, 2017, 07:02:29 am »
I found an intel with power fail protection...

http://ark.intel.com/products/86739/Intel-SSD-750-Series-1_2TB-12-Height-PCIe-3_0-20nm-MLC

Do you believe that 2400 MB/s read and 1200 MB/s write speeds would be quick enough?
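To put those numbers in perspective, here's a back-of-envelope calculation; the 100 GB project size is just an example figure, and it assumes the drive actually sustains its quoted sequential rates:

```python
# Back-of-envelope transfer times at the quoted sequential speeds
# (2400 MB/s read, 1200 MB/s write). The 100 GB size is an assumed example.
def transfer_seconds(size_gb, rate_mb_per_s):
    """Seconds to move size_gb at a sustained rate_mb_per_s (1 GB = 1000 MB)."""
    return size_gb * 1000.0 / rate_mb_per_s

print(f"Read 100 GB:  {transfer_seconds(100, 2400):.1f} s")
print(f"Write 100 GB: {transfer_seconds(100, 1200):.1f} s")
```

So even a very large project loads in under a minute at those rates; for editing work the drive is unlikely to be the bottleneck.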

I found another quad-socket motherboard that will work with the Xeon E7-4800 series, if I wanted to go all the way up to 96 cores one day...

Supermicro X10QBL-4CT

Here was the processor for it...

Xeon E7-4830 v4.

Maybe a little too expensive a setup for right now, but I think they would do well. It's not high priority at the moment, but are there any other good workstation motherboards with four sockets, for up to four CPUs?

Other than that, I think I have good options for CPUs for either the Dual or the Quad builds. I still need to confirm a motherboard, and that graphics card.

Also, never worry about what chassis any kind of motherboard can or can't fit; it's basically irrelevant. I'll be setting everything up in a personally built rack, so I'll build it around whatever I wind up with.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #35 on: January 04, 2017, 08:27:10 am »
For workstation usage, you will never ever need 4 CPUs. NEVER! The only reason you would need that many CPUs is if you need something like 1TB of RAM and run heavy virtualization for multiple servers, where each server has its own purpose.

Go with a single- or dual-CPU design that supports triple- or quad-channel DDR4, ECC, registered memory with low latency (Intel XMP memory profile support is a plus). Another thing to look at is whether the MB supports PCIe 3.0 x16; you will need that for your graphics card and for any SSD or RAID solution you stick into those slots for best performance.
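For a feel of why the lane count matters, PCIe 3.0 gives roughly 985 MB/s of usable bandwidth per lane (8 GT/s with 128b/130b encoding), so the slot width sets the ceiling for a GPU or PCIe SSD:

```python
# Why PCIe 3.0 x16 matters: roughly 985 MB/s of usable bandwidth per lane
# (8 GT/s line rate with 128b/130b encoding), so lane count sets the ceiling.
def pcie3_bandwidth_gb_s(lanes):
    """Approximate usable GB/s for a PCIe 3.0 link with `lanes` lanes."""
    return lanes * 0.985

print(f"x4:  {pcie3_bandwidth_gb_s(4):.2f} GB/s")
print(f"x16: {pcie3_bandwidth_gb_s(16):.2f} GB/s")
```

An x4 slot is already plenty for a fast NVMe SSD, but a GPU wants the full x16.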


If you need the best Graphics card on the market for your CAD stuff, then get the latest Quadro. It's insanely fast and has a lot of ECC GDDR5 memory.
Quadro M6000 24GB quick specs:
  • CUDA parallel-processing cores: 3072
  • GPU memory: 24 GB GDDR5
  • Max power consumption: 250 W
  • Graphics bus: PCI Express 3.0 x16
  • Display connectors: DP 1.2 (4), DVI-I (1), optional stereo (1)
  • Form factor: 4.4" H x 10.5" L, dual slot
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #36 on: January 04, 2017, 09:53:12 am »
Don't get a retail-version Xeon E5; there are versions leaked from OEM system builders that are more powerful and cheaper.
The OEM version is just cheaper, with equal quality. Don't touch ES/QS versions, as these are not even correctly specced and many mobos won't even boot with them.

Yes, I got caught by exactly that. A pair of E5-2696v2 ES QDUF chips looked a steal at $250 each for 12 cores, but my Supermicro X9DRi-LN4F+ mobo wouldn't even start to POST, and the BIOS had to be updated to the latest version, which supposedly supports the E5-2600 v2 series. I put the old E5-2670 v1 pair back in.
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16615
  • Country: us
  • DavidH
Re: Proofreading my computer design and building a server rack station.
« Reply #37 on: January 04, 2017, 05:40:30 pm »
Don't get a retail-version Xeon E5; there are versions leaked from OEM system builders that are more powerful and cheaper.
The OEM version is just cheaper, with equal quality. Don't touch ES/QS versions, as these are not even correctly specced and many mobos won't even boot with them.

Could Intel possibly make things any more complicated?

After Intel dropped ECC support for non-Xeon processors after the Pentium 4, I switched to AMD to avoid the complications caused by their market segmentation.  It is only recently that Intel ECC systems have come down in price (Core i3 socket LGA 1151) and they are lower performance than available AMD systems.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #38 on: January 05, 2017, 07:43:51 am »
slicendice, thank you for your reply again. From what I understand, the many-cores idea was mainly for rendering specifically. Rendering uses multiple cores, and usually the more you have, the better. Even for video rendering the GPU doesn't help much (unless you have a Quadro, like you did indeed suggest, thank you!); rendering relies on CPU cores, so that is what I was thinking about with a 4-CPU board.

That $4000 price tag is what was holding me back from that one. It's definitely the best, I'm sure, but what kind of difference are we talking about, between that one, and this one?

https://www.bhphotovideo.com/bnh/controller/home?O=&sku=1207157&gclid=CJ-2wM6UpdECFceIswodEpIAoA&Q=&ap=y&m=Y&c3api=1876%2C92051677442%2C&is=REG&A=details

Looking at it, I see the immediate difference of 8GB versus 24GB of memory.

blueskull, I'm confused at what exactly you mean? Are you saying to look for one that is cheaper? Another user earlier advised me to stay away from older generation CPUs. Could you further explain, and/or link me to somewhere I could find one that is compatible with these motherboards?

(Monkeh, I made a mess of the last post I replied to you with, I meant to ask you if this board looked any good with the components so far? It looks like the board you suggested, right? It has a bit higher specs though, I think.)

https://www.supermicro.nl/products/motherboard/Xeon/C600/X10DRX.cfm

or...

Supermicro X10QBL-4CT

That one is the quad CPU one though, so at least the first one there?
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #39 on: January 05, 2017, 08:31:48 am »
For your video editing purposes, all cores beyond 10 will sit pretty much idle with your Adobe software. No point in paying for a few extra "heaters" in your rack.

For your 3D/CAD purposes, same thing, with a good GPU, your CPUs will all be almost idle.

For your Audio purposes, all your CPUs will be almost idle if you have a good soundcard/external sound dock/audio processor.

For your programming purposes an average computer with decent GPU for gaming is good enough. ( I develop software on a laptop, and it's really old, and no need to buy a new one yet)

If you go for a 4 CPU design you will be paying for nothing.

A dual-CPU design can be of value if you need more RAM, but that is pretty much the only reason. A dual-CPU design gives you an insane amount of RAM, and even a single-CPU design, if you go Xeon, gives you a lot.

EDIT: My laptop has only 8GB RAM which is enough for game/application development and virtualization. If I need server stuff in my virtualization while testing games/applications, then 16-32GB RAM is enough, depending on what the server software requirements are.
« Last Edit: January 05, 2017, 08:35:15 am by slicendice »
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16615
  • Country: us
  • DavidH
Re: Proofreading my computer design and building a server rack station.
« Reply #40 on: January 05, 2017, 09:46:17 am »
A dual-CPU design can be of value if you need more RAM, but that is pretty much the only reason. A dual-CPU design gives you an insane amount of RAM, and even a single-CPU design, if you go Xeon, gives you a lot.

Xeons support more RAM, but it is more cost effective to maximize the number of DIMM sockets than to use the highest-density DIMMs.  Dual and perhaps quad CPU motherboards are useful simply because they support more DIMMs.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #41 on: January 05, 2017, 10:33:31 am »
Which one is cheaper? Buying 2 more RAM modules or buying 2 more RAM modules and a CPU?

I don't think he will need anything like THIS
« Last Edit: January 05, 2017, 10:37:42 am by slicendice »
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16615
  • Country: us
  • DavidH
Re: Proofreading my computer design and building a server rack station.
« Reply #42 on: January 05, 2017, 11:29:36 am »
Which one is cheaper? Buying 2 more RAM modules or buying 2 more RAM modules and a CPU?

I don't think he will need anything like THIS

I am sorry if I was not clear.  When building a system for maximum memory, it can be less expensive to use a dual processor motherboard with twice as many DIMM sockets than to double the density of the DIMMs.  The reason I would consider an Intel system with 3 instead of 2 memory channels is simply because more memory can be installed and not because of increased CPU performance.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #43 on: January 05, 2017, 11:33:27 am »
He does not need that amount of RAM, and why settle for 2- or 3-channel memory when Xeons support 4-channel? Usually 8 RAM sockets, which is plenty for this build.

Which one is the cheapest?

1 CPU and 4 x 32GB RAM
2 CPUs and 8 x 16GB RAM
4 CPUs and 16 x 8GB RAM

Remember that any cheap Xeon works in a 1-CPU configuration,
that Xeons with dual-CPU support cost a lot more,
and that Xeons supporting 4 or more CPUs cost even more.

And don't forget how much extra heat a 4-CPU configuration produces versus a 1-CPU configuration.
What about power? 4 x 140W CPUs plus a 240W GPU require a much bigger PSU.
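As a rough sketch of that comparison in Python (every price here is a hypothetical placeholder, not a real quote; substitute current street prices before drawing conclusions):

```python
# Rough cost/power comparison of the three configurations above.
# All prices are HYPOTHETICAL placeholders -- plug in real quotes.
configs = [
    {"name": "1 CPU, 4 x 32GB", "cpus": 1, "cpu_price": 600, "dimms": 4, "dimm_price": 250},
    {"name": "2 CPUs, 8 x 16GB", "cpus": 2, "cpu_price": 900, "dimms": 8, "dimm_price": 120},
    {"name": "4 CPUs, 16 x 8GB", "cpus": 4, "cpu_price": 1500, "dimms": 16, "dimm_price": 70},
]
CPU_TDP_W = 140  # assumed per-socket TDP
GPU_TDP_W = 240  # assumed GPU TDP from the post

for c in configs:
    cost = c["cpus"] * c["cpu_price"] + c["dimms"] * c["dimm_price"]
    power = c["cpus"] * CPU_TDP_W + GPU_TDP_W
    print(f'{c["name"]}: ~${cost}, ~{power} W for CPUs + GPU')
```

The same structure extends easily if you also want to account for motherboard cost or per-DIMM power.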
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #44 on: January 05, 2017, 11:43:47 am »
HERE is a good example of many different PC builds for the video editing part. This could be used as a guideline.
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16615
  • Country: us
  • DavidH
Re: Proofreading my computer design and building a server rack station.
« Reply #45 on: January 05, 2017, 11:50:49 am »
You will have to run the numbers to find the best RAM/cost point.  Quad CPU motherboards are definitely past the point of diminishing returns for the reasons you identify.  Dual CPU motherboards are competitive.

An additional complication I have noticed is that increasingly since the earlier DDR DRAM generations, not all DRAM densities are available in all DIMM types.  Last time I checked, there were no high density DIMMs available for desktop motherboards.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #46 on: January 05, 2017, 01:45:30 pm »
We are talking about professional ECC REG memory here, not gaming PC memory where none of that matters. But I see your point.

64GB modules are easy to find, 32GB even easier, and they don't cost that much. In a quad-channel memory configuration, 4 x 64GB is 256GB, which is way more than needed; even 128GB is a lot. In a 4-way system it is most likely possible to fit 8 x 64GB, which equates to 512GB of RAM (total overkill), but this depends on the motherboard.

One point I want to make regarding 8 DIMMs in a 4-channel configuration: performance will be worse compared to using only 4 DIMMs.

Another point: in a dual-CPU, dual-channel system there MUST be 2 DIMMs per CPU to meet the minimum requirements, or the machine won't even start. If going quad-channel, we need a minimum of 8 DIMMs to get the computer to boot. Of course this also depends on other motherboard factors, but in general this is the case with pro-grade motherboards.

Either way, all these facts don't really help much in deciding what to build.

The only important factors are:
8-10 CPU cores in total, single or dual CPU (optimized performance/cost for all Adobe products)
fast GPU with a lot of VRAM (for CAD/3D)
at least dual-channel, preferably quad-channel DDR4, ECC (REG for Xeon), XMP (for an additional performance boost), 2133+MHz and low latency (more important than frequency); at least 32GB in total, preferably 64-128GB; everything above that is questionable for the current usage
enough SATA3 ports for the 4 HDDs, unless using a RAID card
support for 2 x PCIe x16 (simultaneously, if using a RAID card and a GPU)
quality components (especially caps) and 8-phase voltage regulation on the MB for increased stability
full-length M.2 x4 NVMe support on the MB (needed for a fast disk read/write cache)
M.2 x4 SSD (preferably a Samsung NVMe SSD stick)
fast SSD for system boot (preferably a Samsung SATA3 SSD because they are fast and cheap; 512GB is more than enough for all apps and some downloads, and can be upgraded to 2TB later by cloning the original drive and swapping in the new one)
a big enough modular PSU so it can run all the hardware at maximum efficiency without running out of juice

I probably forgot a bunch of stuff, but this is enough to get started with a high-performance workstation for around $5k.

It should be possible to upgrade this system later if choosing the Motherboard carefully.
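The DIMM-population rules above can be captured in a small Python helper (a sketch of the simplified rule of thumb from this post, not a statement about any specific board):

```python
# Minimal DIMM-population check following the rule of thumb above:
# each CPU needs at least one DIMM per memory channel to boot,
# and performance is best with exactly one DIMM per channel.
def check_dimms(cpus, channels_per_cpu, dimms):
    minimum = cpus * channels_per_cpu
    if dimms < minimum:
        return f"won't boot: need at least {minimum} DIMMs"
    if dimms == minimum:
        return "optimal: one DIMM per channel"
    return "boots, but >1 DIMM per channel may reduce memory speed"

print(check_dimms(2, 4, 4))   # dual Xeon, quad channel, only 4 DIMMs
print(check_dimms(2, 4, 8))   # 8 DIMMs, one per channel
print(check_dimms(1, 4, 8))   # 8 DIMMs on one CPU, two per channel
```

Real boards vary (as the post says, this depends on other motherboard factors), so treat this as the general case only.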
 

Offline senso

  • Frequent Contributor
  • **
  • Posts: 951
  • Country: pt
    • My AVR tutorials
Re: Proofreading my computer design and building a server rack station.
« Reply #47 on: January 05, 2017, 02:34:24 pm »
A 1000W PSU is a bit dumb....
The i7-6850K is a 140W TDP CPU, the 1080 has a TDP of 180W, and your HDDs may burn 20W each (not really, but let's say they do).
An SSD will also state that it needs as much power as, or even a bit more than, an HDD, so say 20W for each SSD as well.
Now 8 x 20W (assuming a combo of 8 HDDs+SSDs) + 180 + 180 (assuming you are overclocking your CPU), and let's allow 80W for the MB, the RAM and the fans: that gives you a grand total of 600W IF you put everything under full load. Normal usage will hover around 200W or less, so your 1000W PSU is operating at much lower efficiency than if you had just gone for a 750W one, and even that is overkill.
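That back-of-envelope budget, written out in Python with the same worst-case assumptions (these are the post's pessimistic figures, not measurements):

```python
# Worst-case power budget from the estimate above.
drives = 8 * 20        # 8 HDDs/SSDs at a pessimistic 20 W each
gpu = 180              # GTX 1080 TDP
cpu = 180              # 140 W TDP i7-6850K with overclocking headroom
board_ram_fans = 80    # allowance for motherboard, RAM and fans
total = drives + gpu + cpu + board_ram_fans
print(f"full-load total: {total} W")  # typical use is far lower
```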

I know he's dumb af, but Linus tested an i7-6700 with SLI 1080s and the power usage at the wall socket was around 500W. People like to fit gigantic PSUs and then run them at low loads, so they end up less efficient than a properly sized PSU.

NVMe is PCIe, not SATA3, so why would you want x4 NVMe slots and then talk about SATA3 Samsung SSDs?
Also, with two x4 NVMe drives you would need 4 x 2 = 8 lanes of PCIe, plus 32 lanes for two x16 PCIe cards, and you are out of PCIe lanes. So you might see some loss of performance when you add more things (like the Intel Ethernet interface you could have from the get-go if you just buy a decent motherboard, not the one with more LEDs). Not the end of the world, but that looks like not accounting for everything and just shooting for a list of expensive things.
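A quick lane-budget sketch in Python (assuming a 40-lane CPU such as the i7-6850K, and the device mix discussed above):

```python
# PCIe lane budget for the concern raised above.
CPU_LANES = 40  # i7-6850K provides 40 CPU PCIe lanes
devices = {
    "GPU (x16)": 16,
    "RAID card (x16)": 16,
    "2 x NVMe SSD (x4 each)": 8,
}
used = sum(devices.values())
remaining = CPU_LANES - used
print(f"used {used}/{CPU_LANES} lanes, {remaining} left")
# nothing left over means no CPU lanes spare for an add-in NIC, etc.
```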
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #48 on: January 05, 2017, 03:02:43 pm »
Good points, but this is the reason I mentioned modular PSUs: they reach near-optimal efficiency from almost no load up to maximum load. It is never a good idea to undersize a PSU. Trust me, I've been there, done that, and blown up many computers because of it.

About SSDs: good brands like Samsung and many others consume a fraction of the power of a traditional HDD.

Depending on how much free space there will be on the motherboard for upgrades, I would oversize the PSU so it is ready for those upgrades later.

When choosing a PSU, it is worthwhile to check the efficiency graphs at different loads. Good brands always publish these. Quality PSUs generally cost more.

Either way, this build is not about power consumption, it's about performance.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #49 on: January 05, 2017, 03:08:49 pm »
hahaha, I am not talking about 4x NVMe drives; I am talking about M.2 slots that support x4 bandwidth, which is in spec with the fastest NVMe SSDs today.

The NVMe card (2400MB/s R/W) will be used as a cache for the HDDs, while the normal SSD (550MB/s R/W) will be used as the boot and application drive. You don't need a faster SSD for booting the computer or starting apps, but for caching it is a good idea to have a really fast drive.
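For scale, here is what those two quoted sequential speeds mean for a hypothetical 100GB cache file (Python; best-case sequential numbers, real workloads will be slower):

```python
# Time to move a hypothetical 100 GB scratch/cache file at the two
# sequential speeds quoted above (decimal units, best case).
size_gb = 100
for name, mb_s in [("NVMe cache (2400 MB/s)", 2400),
                   ("SATA SSD (550 MB/s)", 550)]:
    seconds = size_gb * 1000 / mb_s
    print(f"{name}: ~{seconds:.0f} s")
```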

We definitely don't want to use our Boot drive as a cache drive. That would be the most stupid decision ever.

EDIT: Oh and we are not building a gaming rig here that needs 4x PCIe @ x16 simultaneously.
« Last Edit: January 05, 2017, 03:19:06 pm by slicendice »
 

Offline senso

  • Frequent Contributor
  • **
  • Posts: 951
  • Country: pt
    • My AVR tutorials
Re: Proofreading my computer design and building a server rack station.
« Reply #50 on: January 05, 2017, 04:50:02 pm »
M.2 can carry USB, SATA (1, 2 or 3), PCIe (x1 to x4), as well as I2C and some extras; M.2 is a physical connector, not a protocol.
Even PCIe SSDs can be either AHCI or NVMe; the AHCI ones are slower than the NVMe ones, and a bit rare today.
You do feel the difference between a SATA SSD and an NVMe drive as the OS/programs drive; one of my laptops supports both, and the NVMe is a bit faster and even snappier than the SATA SSD.

Why would a modular PSU be more efficient than a non-modular one?
The cables can come out, but that doesn't affect the PSU's efficiency rating; if you compare a Bronze modular against a Platinum non-modular, the latter will have better efficiency.

I wouldn't use an SSD as a cache; if you are going the route of X99 CPUs, throw RAM into the machine and use a RAM disk.
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16615
  • Country: us
  • DavidH
Re: Proofreading my computer design and building a server rack station.
« Reply #51 on: January 05, 2017, 05:46:38 pm »
The efficiency curve for power supplies is not that sharp and power derating the power supply results in higher reliability.  I am getting really tired of replacing improperly derated capacitors in PC power supplies.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #52 on: January 05, 2017, 06:40:22 pm »
Why would a modular PSU be more efficient than a non-modular one?
The cables can come out, but that doesn't affect the PSU's efficiency rating; if you compare a Bronze modular against a Platinum non-modular, the latter will have better efficiency.

I wouldn't use an SSD as a cache; if you are going the route of X99 CPUs, throw RAM into the machine and use a RAM disk.

Using an SSD as a disk cache is all about reliable persistence. Using a RAM drive as a disk read/write cache is not reliable, pointless actually, when you want to persist data.

RAM is overall more reliable (especially ECC), faster, and survives more read/writes. But it does not fulfill the requirements of a read/write cache for persisting data. RAM is temporary; a disk cache is NOT.

How many servers have you seen that use RAM drives as the read/write cache for databases? I bet zero! They use NVMe drives in RAID configurations to speed up reads/writes of the most recent or most recently accessed data.

Modular PSUs have multiple power rails, and the best ones can intelligently enable, disable and switch/combine the rails on the fly depending on the load on each output cable. This improves overall efficiency. No ordinary PSU can compete with that.

The best part of modular PSUs is that you can remove all the extra cables, making the case tidier and giving better airflow to all the circuits.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #53 on: January 05, 2017, 06:42:30 pm »
The efficiency curve for power supplies is not that sharp and power derating the power supply results in higher reliability.  I am getting really tired of replacing improperly derated capacitors in PC power supplies.

 :-+
 

Offline radar_macgyver

  • Frequent Contributor
  • **
  • Posts: 698
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #54 on: January 05, 2017, 08:20:52 pm »
Using an SSD as a disk cache is all about reliable persistence. Using a RAM drive as a disk read/write cache is not reliable, pointless actually, when you want to persist data.

+1. Yet I still see people recommending this (as in, "I just bought a workstation with 96 GB RAM, I'm going to use 32 GB to make a RAM disk and put my swap file there"). It doesn't make sense, because all modern operating systems do this anyway with unused RAM (i.e., use it to cache disk reads). It's taken to an extreme with certain filesystems like ZFS.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #55 on: January 05, 2017, 08:52:03 pm »
Just to make this Ramdisk or SSD write cache a bit more clear....

Scenario:
I produce 2 videos on 2 separate computers, computer A and computer B. On A I use a RAM disk, on B I use an SSD cache. Both videos are the same: batch-produced from 3D software into 4K video, multiple hours long, first rendered into raw video, then encoded into MPEG-4 (pick any standard and container you want, it does not matter).

For the best quality/compression ratio, and after the perfect filters are added, it takes me 3 full encoding passes before the compressed video is in its final state.

Each pass takes about 8 hours to complete, and the rendering takes about 6 days.


Now, what happens after 6 days and 22 hours (6 hours into the 3rd encoding pass) of rendering and encoding is that we have a power loss. All electricity GONE!

Question:

1. What happened to the data in the computer with the RAM disk during the power loss?
2. Could all data fit in the RAM disk?

3. What happened to the data in the computer with the SSD cache?
4. Could the data fit into the disk array?

5. Where can I expect to continue my rendering/encoding process, on computer A and computer B, after the power comes back on and computers have rebooted?

6. How many hours difference in rendering/encoding time will I have between A and B?

Edit: added numbers to the questions for better separation/identification
« Last Edit: January 05, 2017, 08:56:11 pm by slicendice »
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #56 on: January 07, 2017, 09:13:41 am »
Thank you for all of your detailed posts. Sorry, I have not replied in a little while, I was busy.

Slicendice, thank you for that detailed list that you put up. I will be putting up a list of my current selection based on everything that has been suggested to me, tomorrow.

The quad CPU is definitely not necessary anymore, so I will be sticking to a dual CPU instead. I think it should be enough. Thank you for clarifying that. The 10 core thing came to mind after I had already posted.

Does anybody think that this motherboard is a smart choice?

https://www.supermicro.nl/products/motherboard/Xeon/C600/X10DRX.cfm

Like I said, tomorrow I'll be posting a full list of the components I have compiled so far. I have also finished the idea for the custom built server rack that  I will be building the computer into, and I would like to run it by everyone. I want to make sure I will have all of the proper cooling equipment.

Thank you again, everyone!
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #57 on: January 07, 2017, 09:52:00 am »
You're welcome!

Can you please put together a short list of alternative dual Xeon motherboards including approximate price and links to specs?
 

Offline radar_macgyver

  • Frequent Contributor
  • **
  • Posts: 698
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #58 on: January 07, 2017, 09:04:36 pm »
Does anybody think that this motherboard is a smart choice?

https://www.supermicro.nl/products/motherboard/Xeon/C600/X10DRX.cfm

The PCIe slots are x8. This will slow down most GPUs, which are x16. It's also a proprietary form factor, which means it will only work with a specific Supermicro chassis.

A better option would be to look under their barebones systems and pick one that meets your needs. For example, the 7048A-T, a tower/rack convertible, is advertised as quiet; this is important because most rackmount servers sound like mini jet engines, which is why I was surprised you wanted a rackmount as a workstation. Need more GPUs? Consider the 7048GR-TR, with support for up to four and a 2kW PSU.

Supermicro boards have a reputation for being somewhat picky about their memory modules. Sticking to the recommended list will avoid headaches.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #59 on: January 08, 2017, 12:24:13 am »
Yes I got a bit worried about the Supermicro choice too, this is why I would like to see a list of possible alternatives.
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #60 on: January 08, 2017, 03:55:10 am »
Does anybody think that this motherboard is a smart choice?

https://www.supermicro.nl/products/motherboard/Xeon/C600/X10DRX.cfm

PCIe slots are x8. This will slow down most GPUs which have x16.

Slow down? Good luck even installing them.
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #61 on: January 08, 2017, 04:16:15 am »
Does anybody think that this motherboard is a smart choice?

https://www.supermicro.nl/products/motherboard/Xeon/C600/X10DRX.cfm

PCIe slots are x8. This will slow down most GPUs which have x16.

Slow down? Good luck even installing them.

I'll be surprised if any GPU doesn't work with fewer PCIe lanes. Usually an x16 GPU can happily work on even an x1 PCIe link.
The performance penalty is not severe either. In fact, most GPUs don't really need x16 speed at all unless running bandwidth-intensive operations such as GPGPU with a badly optimized host-memory access pattern.
For graphics tasks such as gaming, x8 shows no perceptible difference from x16, according to LinusTechTips' benchmark (search "4-way SLI").

......

You cannot physically install them in the slots.
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #62 on: January 08, 2017, 06:02:44 am »
You cannot physically install them in the slots.

There are x16 physical slots with only an x8 or even x4 signal connection, commonly seen on mobos. In fact, consumer-grade Intel chips only have 16 PCIe lanes, so in a dual-GPU configuration the slots must operate in x8 mode, let alone 3-way SLI, which usually runs x8/x4/x4 or x4/x4/x4 if an NVMe SSD is used.

Irrelevant. Look at the board we are discussing.

Quote
There are advanced enthusiast-level mobos with multiple true x16 slots supported by a PLX chip, which is essentially a PCIe hub that allows on-demand bandwidth allocation, instead of the fixed or boot-time-selectable allocation implemented on cheap boards.

Irrelevant.

Quote
Also, keep in mind that you can always hacksaw (pun intended) your PCIe slot into a semi-open slot to accommodate any length of card. Though the ID pin will be missing, auto link negotiation will still make sure the connection works at its maximum bandwidth.

Somehow I knew you'd bring this up, and if you want to take a hacksaw to a $650 motherboard, more power to you. Nobody else does.
 

Online MK14

  • Super Contributor
  • ***
  • Posts: 4539
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #63 on: January 08, 2017, 06:21:47 am »
I did this on my SuperMicro C226 mobo to fit an Intel Xeon and make it coexist with GTX750Ti.
I did similar things (though non destructive, just riser cards) to my AsRock X99 to bring M2 slot to U2 for SSD, and to bring mSATA/mPCIe to PCIe.

If you ever fix any of my computer equipment, I will insist that you DON'T bring the hacksaw. Please leave the hacksaw safely under lock and key at home.  :-DD (I'm both 100% completely joking and 100% serious at the same time, if you understand what I mean.)
tl;dr
If I had a brand new £10,000 workstation, I would NOT like someone to use a hacksaw on the motherboard to make it (ignoring the initial PCB production, obviously).

EDIT:
Googling the hacksawing: apparently some people who buy servers when they are rather old and not worth very much, e.g. £100, might use the hacksaw method on the graphics card (a cheap one) or the motherboard, so they can put in a powerful gaming graphics card.
But I think it is a different matter if it is an old £100 server used as a spare gaming machine or something, rather than a >£5,000?? workstation for serious work, which might need to be returned under warranty.
« Last Edit: January 08, 2017, 06:42:08 am by MK14 »
 

Offline radar_macgyver

  • Frequent Contributor
  • **
  • Posts: 698
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #64 on: January 08, 2017, 07:40:25 am »
You cannot physically install them in the slots.

Yes you can (look closely at the right edge of the x8 slot):



On the motherboard in question, note the large open space below the PCIe slot area. This is to take up any overhang from x16 boards.

I'll be surprised if any GPU doesn't work with fewer PCIe lanes. Usually an x16 GPU can happily work on even an x1 PCIe link.
The performance penalty is not severe either. In fact, most GPUs don't really need x16 speed at all unless running bandwidth-intensive operations such as GPGPU with a badly optimized host-memory access pattern.
For graphics tasks such as gaming, x8 shows no perceptible difference from x16, according to LinusTechTips' benchmark (search "4-way SLI").
You may be right, especially with PCIe gen 3 which can stuff a lot of data down each lane.
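Rough per-lane arithmetic in Python (assuming roughly 0.985 GB/s of payload per gen 3 lane, i.e. 8 GT/s with 128b/130b encoding):

```python
# Approximate payload bandwidth per PCIe gen 3 link width, to show
# why x8 gen 3 is rarely the bottleneck for a single GPU.
GEN3_GB_S_PER_LANE = 0.985  # ~8 GT/s * 128/130 / 8 bits
for lanes in (1, 4, 8, 16):
    print(f"gen3 x{lanes}: ~{lanes * GEN3_GB_S_PER_LANE:.1f} GB/s")
```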
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #65 on: January 08, 2017, 08:26:34 am »
As a quick note again, do not worry about the size of the motherboard or what it can fit into. I am going to custom-build something myself to house the PC.

I did post some other alternative boards earlier, but they weren't smart choices, or they were quad-CPU boards, which turned out to be unnecessary. The Supermicro direction came from a recommendation by Monkeh; this wasn't the board he suggested itself, but another Supermicro board.

The only other board I was looking at was this one...

https://www.asus.com/us/Motherboards/Z10PED8_WS/

I think it's even worse off than the supermicro one at this point, though.

I'm going to list all of my current pieces. Would someone be willing to help me pick a good motherboard? Again, price is no concern. I just want it to be something future-proofed. Something that I can continue to upgrade further and further in the future.

Here are my components so far...

RAM memory: http://www.crucial.com/usa/en/ct4k32g4lfd424a

Graphics card: https://www.bhphotovideo.com/bnh/controller/home?O=&sku=1207157&gclid=CJ-2wM6UpdECFceIswodEpIAoA&Q=&ap=y&m=Y&c3api=1876%2C92051677442%2C&is=REG&A=details

CPU Processor X2: https://www.bhphotovideo.com/bnh/controller/home?O=&sku=1226971&gclid=CI_okcWSpdECFdeEswodSLcOWQ&is=REG&ap=y&m=Y&c3api=1876%2C92051677442%2C&A=details&Q=

SSD PCIe: http://www.newegg.com/Product/Product.aspx?Item=9SIA12K4ND8252&ignorebbr=1&nm_mc=KNC-GoogleMKP-PC&cm_mmc=KNC-GoogleMKP-PC-_-pla-_-Solid+State+Disk-_-9SIA12K4ND8252&gclid=CKPyxYqPstECFc5LDQodKOEJ6g&gclsrc=aw.ds

I haven't found anything wrong yet, except that I don't have a motherboard. These were all taken from and/or based on other users' suggestions.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #66 on: January 08, 2017, 09:24:02 am »
For this build I really don't see any reason for a dual-CPU system. If it were going to be a do-it-all server or a virtualization station, then the more CPUs the better, as more CPUs give a lot more cores/threads and more RAM too.
 

Online MK14

  • Super Contributor
  • ***
  • Posts: 4539
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #67 on: January 08, 2017, 04:26:30 pm »
If you ever fix any of my computer equipment. I will insist that you DON'T, bring the hacksaw.
If I were to have a brand new, £10,000 Workstation. I would NOT like someone to use a hacksaw on the motherboard, to make it.

In fact, I use a Dremel Micro (I regret not getting the Dremel 12V version) with a 405 saw blade ;).
BTW, my main workstation is not a cheapo by any means, especially considering its water-cooled 22-core Xeon E5, 64GB REG ECC RAM and 1.2TB Intel SSD, plus a pair of Dell premium-line LCD monitors, plus the 5-digit sum I spent on software. The money I spent on RAM and SSD alone could build a top gaming machine.

I think it is just that opinions/methods vary as regards physically modifying computer equipment. It is not necessarily a matter of either of us being right or wrong; our opinions just differ.



To the OP: there are some big developments coming to server computers in roughly 6 months (a very approximate estimate) and later.

AMD is going to release new server processors (they have been quiet on the server front for many, many years), which may dramatically lower server CPU prices and/or improve the number of cores you get for your money.

Intel is releasing the next generation of server processors, with a significantly improved platform.
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #68 on: January 08, 2017, 04:58:11 pm »
You cannot physically install them in the slots.

Yes you can (look closely at the right edge of the x8 slot):



On the motherboard in question, note the large open space below the PCIe slot area. This is to take up any overhang from x16 boards.

Some slots being open backed != all slots being open backed.

Sadly in this case, having checked the manual for actual details, your assumption is closer to correct - most slots on that board are actually open backed.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #69 on: January 09, 2017, 07:02:42 am »

Quote
There are OEM-version chips out there that are cheaper and more powerful. These are NOT the crappy, unreliable ES/QS ones (which are essentially a silicon lottery).
OEM chips are mass-produced chips that target only big customers like Oracle or Cisco, who build specialized servers in massive quantities.
Some of these chips leak to eBay through the grey market. There is a vivid market smuggling OEM chips to evade the IP cost imposed on their retail counterparts.
I would suggest a single-socket E5-2679v4 or E5-2696v4, which is more powerful than 2x your chosen CPU at only a slightly higher price, plus it doesn't have the QPI performance penalty of multi-socket systems.
When you really need more power, just add another one to double the performance.

My E5-2696v4 absolutely rocks. It is only 1/3 the price of its retail counterpart, the E5-2699v4, while all specs are exactly the same except the max turbo boost frequency, which is actually 0.1GHz higher than the 2699v4's, and the specific stepping (silicon revision) I have can turbo boost to 2.9GHz with all cores in use, even in AVX mode, while maintaining <110W power dissipation.
Using a single socket instead of dual sockets not only circumvents the QPI performance penalty, but also saved me $$$ on a proper server mobo, as I can use a regular X99 mobo with tons of DIYer-friendly features. What's more, it makes the ITX form factor possible, so now I can own the world's most powerful shoebox computer.

blueskull, thank you for your advice. I looked into the processors you suggested, and I think I will be getting one of those. They seem to be a much better deal.

As for the rest of my components, are they any good?

Most importantly, can anyone suggest a good motherboard, please? I've continued searching, but I can't find one that is any better than my earlier Supermicro suggestion.

I'll be back on tomorrow with my concept for the server rack build.

 

Online MK14

  • Super Contributor
  • ***
  • Posts: 4539
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #70 on: January 09, 2017, 07:04:31 am »
I just want it to be something future-proofed. Something that I can continue to upgrade further and further in the future.

That's why I mentioned (or hinted) that roughly every ten years Intel brings out a dramatically new/improved overall scheme of things for servers/workstations. We are getting somewhat close to the next big one.
I'm NOT sure if it is weeks, months or later this year away. Maybe someone who knows more about the (probably secret) release schedule can chime in.

But buying now is not the end of the world, and the system will still be somewhat upgradable for some time.
I'm looking forward to Skylake-EP, the new Purley platform, and its improvements.
Since the exact details tend to come from rumors/leaks, I can't accurately attempt to put you off buying the current motherboards and CPUs.

If you need the thing NOW, then it can't be helped.

N.B. I'm known to be too enthusiastic about new/upcoming products and tend to over-emphasize waiting for them. So feel free to ignore this post and buy the existing one.

But the most future-proof system is the next one (hopefully not too long away), because its basic overall concept will remain current for something like ten years (ultra approximately).

tl;dr
Due to the secrecy surrounding the new system, I'm not sure how much better it will be, but it sounds very promising in some areas.
« Last Edit: January 09, 2017, 07:07:52 am by MK14 »
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #71 on: January 09, 2017, 07:40:46 am »
For the last 5-10 years, nothing really impressive has been released regarding CPU performance. On Intel chips the focus has been on lower power consumption and faster integrated graphics; that's pretty much it. I don't expect the next generation of CPUs to be much different. Only AMD is going to release its next-gen CPU and GPU really soon, which sounds amazing, but is it worth the wait? Maybe, maybe not. Only the future will tell.

I think you should at least take a look at the ASUS professional (WS line) motherboards; they are quite reliable and have a lot of features for a decent price. But as I stated earlier, you don't need something like a dual-CPU system with 32 cores per CPU; that's a waste of money for your purposes.

The only really valuable upgrades later on would be room for more HDDs and RAM, and possibly the graphics card, but if you're going for a Quadro then an upgrade will be quite expensive.
 

Online MK14

  • Super Contributor
  • ***
  • Posts: 4539
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #72 on: January 09, 2017, 07:45:58 am »
I'm looking forward to Skylake-EP, the new platform purley, and its improvements.

There are already top-of-the-line chips leaked for the LGA3647 platform, as well as mobos from Supermicro on Taobao.
The top-of-the-line chip is a 32-core, 2.1GHz E5-2699v5, which has a 220W TDP.
The currently revealed E5v5 has 28 cores at 1.8GHz, but the new unreleased beast will have 32 cores at 2.1GHz.

Even so, the 220W TDP makes it not anywhere more advanced than the v4 platform: 220W / 32 / 2.1GHz = 3.27 W/GHz per core, while the current flagship E5v4 has a 145W TDP, which corresponds to 145W / 22 / 2.2GHz = 3.00 W/GHz per core.
Also, it is unknown what the all-core turbo boost frequency of the new chip will be, while the v4 generation can turbo boost up to 2.8GHz with all cores crunching numbers.
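Those per-core efficiency figures, computed explicitly in Python (using the TDP, core-count and base-clock numbers quoted above):

```python
# W per core-GHz: TDP divided by (cores x base clock).
def w_per_core_ghz(tdp_w, cores, ghz):
    return tdp_w / cores / ghz

v5 = w_per_core_ghz(220, 32, 2.1)   # rumoured E5-2699v5 figures
v4 = w_per_core_ghz(145, 22, 2.2)   # shipping E5v4 flagship figures
print(f"v5: {v5:.2f} W/GHz per core, v4: {v4:.2f} W/GHz per core")
```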

I'm also hoping it may allow people to use dual-socket motherboards WITHOUT needing special (often server-only) NUMA-aware software to properly exploit both sockets' memory. I have not seen details yet showing to what extent Intel has achieved this; there will probably still be some benefit to using NUMA-aware software.
It is perfectly possible Intel shows little or no improvement in this respect. I just don't know yet.

Because, as you said in an earlier post, the existing two-socket motherboards have relatively slow inter-socket memory (QPI) links between them, making NUMA-aware software fairly important if you want to get the best performance out of them.

Hopefully the new platform (Purley) will have much faster memory links between the two CPU sockets.

If not, a maximum of 32 CPU cores (probably more later, on the fresh platform) even on a single socket does not sound too bad.

The next one will be rather "future proof", because the motherboard will take at least two or more generations of Intel server chips, i.e. the new socket type will probably remain current for 5 or more years (NOT guaranteed).

Whereas the existing one (which the OP seems to want to get) has probably seen the last new Intel CPU it is ever going to get, and will gradually disappear once the new ones come out. But it will still be supported for some time.
« Last Edit: January 09, 2017, 07:53:36 am by MK14 »
 

Online MK14

  • Super Contributor
  • ***
  • Posts: 4539
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #73 on: January 09, 2017, 07:56:56 am »
For the last 5-10 years, nothing really impressive has been released regarding CPU performance.

For some people it may be considerably faster, because of the new platform, Purley. The existing platforms have limited the available performance, and this limit may be considerably extended on the upcoming platform.

E.g. the inter-CPU memory links are especially performance-critical for some software, and may be faster by a huge factor in these new processors.

Or to put it another way: you can currently get lots of CPU cores across the two sockets, something like 44 cores in total. But some software CAN'T usefully exploit that many cores, because it spends all its time transferring data across the memory links and waiting on other delays.
A well-designed new platform can considerably speed up those memory links (and other things), so that instead of only being able to use, say, 15 of the 44 cores usefully,
you may be able to use 55 out of a total of 64 cores, giving a huge speed improvement.
I.e. 55 usable cores instead of 15, because the memory-link bottleneck has been relieved.
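That bottleneck argument can be sketched with a toy model: assume each busy core demands some fixed share of the inter-socket bandwidth, and cores beyond the link's capacity add nothing. All numbers below are invented purely to reproduce the 15-of-44 and 55-of-64 illustration, not real QPI/UPI figures:

```python
def usable_cores(total_cores: int, link_bw_gbs: float, per_core_demand_gbs: float) -> int:
    """Cores that can run usefully before the inter-socket link saturates (toy model)."""
    return min(total_cores, int(link_bw_gbs / per_core_demand_gbs))

# Old platform: 44 cores, but a slow link saturates at ~15 cores' worth of traffic
print(usable_cores(44, 30.0, 2.0))   # 15
# New platform: more cores AND a much faster link
print(usable_cores(64, 110.0, 2.0))  # 55
```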
« Last Edit: January 09, 2017, 08:05:14 am by MK14 »
 

Online MK14

  • Super Contributor
  • ***
  • Posts: 4539
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #74 on: January 09, 2017, 08:11:54 am »
Because, as you said in an earlier post, the existing two-socket motherboards have relatively slow inter-socket memory (QPI) links between them, making NUMA-aware software fairly important if you want to get the best performance out of them.

Correct. The new UPI might help, but I don't expect OmniPath on the low-cost LGA2066 platform. It will be an LGA3647-exclusive feature.

The next one will be rather "future proof", because the motherboard will take at least two or more generations of Intel server chips, i.e. the new socket type will probably remain current for 5 or more years (NOT guaranteed).

I would think twice. Starting from Skylake, HEDT will not be Xeon-compatible. There will be X299 motherboards using LGA2066, targeting workstations and enthusiast gamers, and C622 motherboards using LGA3647, targeting multi-socket servers and MIC chips.
Since cheap grey-market server CPUs will no longer work on cheap consumer-grade motherboards, the platform will be considerably more expensive than the current X99/LGA2011-3 platform. You will either have to pay Intel for a more expensive server motherboard or for a retail CPU. Using OEM CPUs on consumer motherboards will be a thing of the past.

Intel may be a bit too interested in their high profit margins in not allowing Xeons to work on the cheaper consumer motherboards. But AMD's upcoming Zen CPUs (under various code names) will hopefully provide much-needed competition, so that Intel plays fairer in the future.

You should still be able to buy reasonable (but much more expensive) single-socket server motherboards for the Xeons, so it should not be so bad. Except that your tiny ITX water-cooled computer may no longer exist as an option, if you are unlucky.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #75 on: January 12, 2017, 07:22:09 am »
This is my current list of components for a $6000 computer system.

Motherboard - ASUS Z10-PE WS

Memory RAM - Samsung 64GB PC4 19200 DDR4 ECC RAM (I'll add more as I go along)

Graphics card - PNY Quadro M4000

SSD - Intel 750 Series PCI-E 1.2TB

CPU - Intel Xeon 10-core processor E5-2630 V4 (I will add another as I go along)

Can anyone suggest a good power supply that is reasonable, and any other accessories that will prevent this computer from being damaged or exploding? Thank you.

 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #76 on: January 12, 2017, 07:50:16 am »
This is my current list of components for a $6000 computer system.

Motherboard - ASUS Z10-PE WS

Memory RAM - Samsung 64GB PC4 19200 DDR4 ECC RAM (I'll add more as I go along)

Graphics card - PNY Quadro M4000

SSD - Intel 750 Series PCI-E 1.2TB

CPU - Intel Xeon 10-core processor E5-2630 V4 (I will add another as I go along)

Can anyone suggest a good power supply that is reasonable, and any other accessories that will prevent this computer from being damaged or exploding? Thank you.

Still looking for some PSU alternatives.

Meanwhile, at least consider purchasing a Samsung 960 Pro 1TB or 2TB card that goes in the M.2 slot instead of the Intel 750. The 2TB model is almost the same price as the Intel SSD, but it's almost twice as fast, has a 5-year warranty, is less than a tenth of the size, and has good support software for optimizing performance and reliability. Because of its performance rating its temperature goes up, so make sure the card gets properly vented, though, or it will die.

Your Asus MB is the D8 model, right?
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #77 on: January 12, 2017, 08:32:53 am »
I would be very wary of any Asus board in this segment of the market. I have the Z9PE-D8 WS (dual LGA2011 v1/v2) and it's flakey as hell, particularly with regard to inconsistent POSTing; mine needed to be RMA'd. I am not alone: there are pages and pages of tales of woe on this particular board, and I can't believe there's going to be any dramatic improvement with a newer board.

Asus are simply not experts at this end of the market.

Make sure the BIOS is up to date before you purchase the board. It is not at all uncommon to receive a board with an early BIOS that you simply can't update unless you have a compatible early-version CPU.

Whatever RAM you get, make sure it's on the QVL; the very first thing they'll tell you to do when it doesn't work is to check that the RAM is on the QVL (after asking if you have the latest BIOS).

Check online to see if others have had success with your build, including the GPU and even the PSU: there are cases where even a PSU can have compatibility problems despite being adequately specced for the power requirements.

I'd recommend SuperMicro as a better bet than Asus or another consumer-grade manufacturer. This segment is SuperMicro's bread and butter; it's what they do, and they don't do consumer-grade boards.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #78 on: January 13, 2017, 07:39:50 am »
Yes, my board is the D8 model.

I was actually recommended earlier not to go with a Supermicro board; that is why the Asus came up.

This is the only supermicro board I have found that is compatible so far.

https://www.supermicro.nl/products/motherboard/Xeon/C600/X10DRX.cfm

Do you think this is a better choice than the Asus board?
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #79 on: January 13, 2017, 07:59:00 am »
I read a lot of articles regarding that older-model ASUS, and I think the issues people had were more about not understanding the settings than the board being bad. Asus boards have a lot of settings that can be played with, and one bit in the wrong place breaks the system. Another thing I would point out is that because most board owners were DIY builders, they may have assembled it wrong or over-tightened the cooling blocks, etc. Modern CPUs and motherboards are very delicate pieces of machinery, and one has to be careful.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #80 on: January 14, 2017, 06:52:26 am »
slicendice: Thank you for the information about the Asus board.

Does anybody think the Supermicro board I linked in my last post is any good? Is it by any chance a better alternative?

By the way, that Samsung SSD does look a lot better and a lot faster. I think I'll be going with that one.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #81 on: January 14, 2017, 09:23:05 am »
You're welcome! :-)

About PSUs: I have not had the time to look into this closely, but the most essential thing to look for is that the PSU has enough rails to give enough current to each device plugged into the system.

The GPU needs its own dedicated rail, and the motherboard with all its RAM and CPUs needs another one. All other peripherals need their own rail too. Depending on the GPU (140-240W), the PSU must be able to output 15-25 A on that 12V rail; the same goes for the motherboard if you're going for a dual-Xeon system (2 x 140W+). So it's best if the MB has two connectors and the PSU has two power cables on their own rails, each with enough power to keep the devices alive without losing voltage stability. The reason it is so important not to put too small a PSU in the system is that when the system POSTs/boots, all the devices briefly draw near-maximum current. If the PSU is too small, the system won't boot, or worse, the PSU or some of the devices can blow up. We don't want that. And of course the same applies under heavy load while rendering something.
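As a back-of-the-envelope check on those 15-25 A figures: current on a 12V rail is just watts divided by 12, plus some headroom for the boot-time inrush described above. The 25% headroom factor here is my own assumption for illustration, not a spec:

```python
def rail_amps_12v(device_watts: float, headroom: float = 1.25) -> float:
    """Required 12V rail current for a device, with inrush/transient headroom."""
    return device_watts / 12.0 * headroom

print(round(rail_amps_12v(140)))      # 15  (140W GPU)
print(round(rail_amps_12v(240)))      # 25  (240W GPU)
print(round(rail_amps_12v(2 * 140)))  # 29  (dual 140W Xeons on one rail)
```

With a 25% margin, the 140-240W GPU range works out to roughly the 15-25 A the post quotes.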
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #82 on: January 14, 2017, 10:10:14 am »
What are you talking about?

Modern GPUs have their own power connectors because they draw so much current. These cables have their own power rails in the PSU; the motherboard would blow up if that much current went through it. In a 4-way SLI configuration that could amount to 80-100A on 5V/12V. I am not talking about two or more physical power supplies: modern PSUs have at least four power rails that are completely separate from each other. Some can even combine two or more rails into one for even higher current on a single connector when needed. Some PSUs even have a USB connector you can attach to the motherboard and then use software to configure and monitor these rails on the fly.

We are NOT talking about 2-4 redundant PSU configurations here. Don't overcomplicate things with redundancy; that is something completely different.

I've built hundreds (if not thousands; I lost count a long time ago) of consumer gaming PC builds, and the things I mentioned are very important to understand and take into account when building a stable computer that consumes a lot of power.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #83 on: January 14, 2017, 10:16:54 am »
Anyway, my point is: DON'T try to save on the PSU. Make sure each rail can supply enough current for its device, and make sure each GPU has its own separate power rail, or you will get an unstable system and most likely blow things up.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #84 on: January 14, 2017, 10:33:56 am »
HERE is a basic tutorial on how to choose a PSU for your particular build. It is best to read the full article on how to build a PC from start to finish (not only the PSU part, but going from the basic idea to measuring and choosing the right parts).
 


Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #86 on: January 14, 2017, 04:22:32 pm »
I read a lot of articles regarding that older-model ASUS, and I think the issues people had were more about not understanding the settings than the board being bad. Asus boards have a lot of settings that can be played with, and one bit in the wrong place breaks the system. Another thing I would point out is that because most board owners were DIY builders, they may have assembled it wrong or over-tightened the cooling blocks, etc. Modern CPUs and motherboards are very delicate pieces of machinery, and one has to be careful.

No, it is that bad. I have very recent first-hand experience of it, and it IS flakey as hell compared to, say, the X9DRi-LN4F board I'm also using. The biggest problems are its inconsistent POSTing and the lack of clear documentation around the settings and POST codes.

Like I say, check real board and build reviews from real users, and be wary of board reviews from the usual suspects like Anandtech: as professional reviewers they will have had the OEM bending over backwards to provide support, and they are not going to be long-term users. Newegg, for example, has board reviews from real people, typically with pros and cons, which gives good insight into how a board will work in a real environment.
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16615
  • Country: us
  • DavidH
Re: Proofreading my computer design and building a server rack station.
« Reply #87 on: January 14, 2017, 04:45:52 pm »
PSU reviews in a nutshell

The problem with these reviews is that they are practically useless except for determining functionality and specifications when new.  That is a good place to start, but it is not enough.  How do they perform at the end of their warranty period?  What is their design lifetime?  What is their real failure rate?  I would even settle for a list of warranty durations, but I have to compile that unit by unit.

Power supply manufacturers have become adept at derating the output capacitors so that the design lifetime is not much longer than the warranty period; in some cases they are practically the same.  Users end up renting power supplies rather than buying them.
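The derating described here follows the usual electrolytic-capacitor rule of thumb: expected life roughly doubles for every 10°C the capacitor runs below its rated temperature. The hours and temperatures below are generic datasheet-style examples, not figures from any particular PSU:

```python
def cap_life_hours(rated_life_h: float, rated_temp_c: float, actual_temp_c: float) -> float:
    """Arrhenius rule of thumb: life doubles per 10 degC below the rated temperature."""
    return rated_life_h * 2 ** ((rated_temp_c - actual_temp_c) / 10)

# A 2000 h @ 105 degC capacitor running at 65 degC:
print(cap_life_hours(2000, 105, 65))  # 32000.0 hours, roughly 3.7 years continuous
# The same part running hot, at 95 degC, barely outlives a short warranty:
print(cap_life_hours(2000, 105, 95))  # 4000.0 hours
```

This is why a cheaper 85°C-rated capacitor in a hot PSU can put the design lifetime right at the warranty boundary.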
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #88 on: January 14, 2017, 05:47:57 pm »
I don't think the problem with caps dying has much to do with bad caps; heat and undersizing the PSU are the biggest problems. Many PC builders rely too much on the PSU's cooling fan, or they have a bad overall cooling design for their case, so the PSU dies very fast, and so do some motherboard caps.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #89 on: January 15, 2017, 06:26:27 am »
https://www.amazon.com/Corsair-RM1000x-Modular-Supply-Certified/dp/B015YEI7LK

I researched good PSUs, and how they work. Corsair seems to be a very good choice, and that is the model I wound up looking into.

Of course, I need a board that will work with that. I'm trying to figure out whether the Asus board or the Supermicro board would work.

From what I understand, this Supermicro board will work with all of my other components so far, right?

https://www.supermicro.nl/products/motherboard/Xeon/C600/X10DRX.cfm

Thank you to everyone again.
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #90 on: January 15, 2017, 01:44:31 pm »
I run the HX1000i in my Asus build, which is slightly more efficient than the RM1000x (Platinum vs Gold on the 80 Plus rating). Both have a 10-year warranty and boast "Japanese" caps, but don't specify the brand.

It works fine, but I found a couple of the cable runs in my build with the supplied modular cables were tighter than I'd like, it was marginal but it works. It depends on your particular enclosure, motherboard and how you manage your cables as much as anything.

One thing to check with your enclosure is whether it directly supports your motherboard. For EATX and EEB sized boards you frequently need to make sure no blind mounting-hole standoffs are present, or your board risks shorting out. You're also likely to need to drill and place your own standoffs in some instances. EATX does not share the same mounting holes as EEB, despite the boards having the same dimensions.

The biggest problem with self-builds at this end of the market is that the "gene pool" of builders rapidly diminishes, so these boards aren't as widely characterised as consumer boards. In your average server farm, although there'll be hundreds of servers, they tend to run on a very limited number of pre-certified and tested platform configurations.

I'd be very reluctant to state categorically that a given build will work faultlessly without having had direct experience of it. If you follow the board vendor's QVL then you will at least have some form of comeback, and with that I'd put more trust in SuperMicro's support than in Asus's for this type of board. You only have to look at a few of the Linus Tech Tips videos on their rendering machine from a year or so ago to see that even they had an enormous amount of difficulty getting their boards going.

I would also check that any form of multi GPU environment that you're going to use is supported.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #91 on: January 15, 2017, 08:22:30 pm »
Something to consider regarding choice of CPU, price and GPU assisted video encoding.

 

Offline DimitriP

  • Super Contributor
  • ***
  • Posts: 1307
  • Country: us
  • "Best practices" are best not practiced.© Dimitri
Re: Proofreading my computer design and building a server rack station.
« Reply #92 on: January 15, 2017, 10:33:10 pm »
Something to consider regarding choice of CPU, price and GPU assisted video encoding.



This in not news to some :) 
I always sort CPUs by Speed first, then cores then price.
Let's call it "an intuitive assumption", and leave it at that.

   If three 100  Ohm resistors are connected in parallel, and in series with a 200 Ohm resistor, how many resistors do you have? 
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #93 on: January 16, 2017, 07:08:42 am »
Thank you for that video and the information about the CPU speed and the CPU cores.

Most Adobe CS6 programs support multi-threading, so additional cores should still be useful. Is a 2.2GHz CPU with 10 cores a big jump compared to a 3.0GHz CPU with 4 cores, especially when using programs such as Adobe Premiere, After Effects, Photoshop, and Illustrator?
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #94 on: January 16, 2017, 07:33:48 am »
That's the thing: when using GPU-assisted encoding, the CPU will most likely sit mostly idle, and on a dual-CPU board the added complexity in the system can slow things down significantly. So yes, a 4-10 core CPU can outperform a dual 22-core design in GPU-assisted encoding with ease. But if running CPU-only encoding, more cores is generally better. Adobe software has its limits, though, and that is what we talked about at the beginning of this thread: a 10-core CPU or less with hyper-threading is more than enough for Adobe software. What matters more is memory timings (low latency is more important than high frequency), CPU operating frequency, and bus bandwidth.

Another thing that makes a huge difference is which slot each add-on card is plugged into. It is common to dedicate the second CPU to the GPU, using a PCIe slot that belongs to that CPU.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #95 on: January 17, 2017, 03:31:15 am »
Will the Quadro M4000 and a 3.0GHz 4-core Xeon processor serve me better than, or equally to, a 10-core Xeon processor, especially with your above method?

If so, can someone help me find a cheaper 3.0GHz Xeon Processor?

The 10 core that I was looking at was clocked at 2.2 GHz, and cost only $600...

https://www.bhphotovideo.com/bnh/controller/home?O=&sku=1226971&gclid=CNXuz87KudECFctMDQodA64ANQ&is=REG&ap=y&m=Y&c3api=1876%2C92051678882%2C&A=details&Q=
 

Offline DimitriP

  • Super Contributor
  • ***
  • Posts: 1307
  • Country: us
  • "Best practices" are best not practiced.© Dimitri
   If three 100  Ohm resistors are connected in parallel, and in series with a 200 Ohm resistor, how many resistors do you have? 
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #97 on: January 17, 2017, 05:31:50 am »
^Thank you very much, that looks pretty good.

Just so I have the knowledge, could you link me to that CPU's dual-socket 2600 v4 equivalent? Just in case I want to add more cores in the future, as Adobe Premiere and After Effects can make use of up to ten cores.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #98 on: January 17, 2017, 07:10:47 am »
After doing some more research myself, I have come to the conclusion that a high-end $5000 Quadro (as was suggested to me earlier in this thread) will probably not be well utilized for my purposes, at least not compared to getting a 980 Ti or a Titan X.

Now, for heavily GPU-reliant processes such as video rendering and 3D modeling, do you think the Titan X's 12 GB will serve me better, or just go to waste, when I could get the 980 Ti's 6GB for a third of the price?

http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-980-ti/specifications

http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-titan-x/specifications

Also, will a Xeon processor really serve me better for the above two processes? Could I get an i7 with 4 cores and a 4.0 GHz clock speed, or an i7 with 6 cores at 3.3 GHz, and get a faster machine for less?

From what I understand, GPU-accelerated video rendering peaks at about 6 cores. For my uses, would 4 cores at 4.0 GHz do better, or 6 cores at 3.3 GHz?

Adobe Creative Suite 6 programs can indeed utilize up to 10 cores, and hyper-threading is possible, but would fewer, faster cores still be more worthwhile than a Xeon with more cores and a slower clock speed?

Thank you all for your continued assistance.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #99 on: January 17, 2017, 07:21:54 am »
I am pretty certain a high-end i7 CPU with a GTX 1080 will be more than enough for years to come. You don't really need that Xeon, as it just makes the price go up a lot. The Titan X is expensive, so the GTX 1080 is a better choice; it's faster and cheaper.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #100 on: January 17, 2017, 07:35:34 am »
I'll look into the 1080, but from what I understand the 980 Ti and Titan X should be faster?

Just to be certain: I will not wind up suffering longer renders or laggy video playback while editing if I get an i7 over the Xeon? I do understand that the Xeon will NOT help me with video encoding whatsoever if I'm using the GPU, but will giving up cores (in exchange for clock speed) hinder the actual video-editing process?

I think, as far as my workflow is concerned, the actual video editing and the After Effects work is the part I most urgently need to speed up. Previewing anything is literally impossible right now.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #101 on: January 17, 2017, 07:49:48 am »
No, the 1080 is faster than all of those except the Titan X Pascal, which is the fastest GPU you can find today. The GTX 1080 is based on the Pascal architecture.

Check THIS site out and build your own "gaming rig" (a good baseline for anything else too). It might help you get an idea of what to look for.

CPU, GPU, RAM, SSD, HDD and MB.

About the Adobe tools: I don't have personal experience with the latest tools and hardware, so I can't be much help there. We need someone who has actually built a beast for video editing.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #102 on: January 17, 2017, 08:23:29 am »
I was actually thinking of building two separate PCs, rather than one single PC to do all of my tasks. It seems to make more sense.

I remembered that I can't use the i7 for my music production, as it doesn't support ECC memory, and I believe the ones that actually do only support up to 64 GB.

If I built one PC around the i7 4.0 GHz with a powerful GPU that I could use for my video editing and video encoding/rendering specifically, and use the 64 GB non-ECC RAM that is available, I could work around the need for multiple cores in my other "studio computer", so to speak.

Then I would build another PC built around a dual Xeon build specifically for my music workstation where I could use 128 GB ECC RAM for any music processing and editing where ECC is more efficient. Also being able to use this PC for any possible small server needs in the future.

I'm building a massive case for this computer, so there would be plenty of room, and separate systems could indeed speed up the workflow, as a video render would not affect anything on the second computer, where I could multitask my music-creation process.

The first PC would need a good GPU for rendering/encoding, and could easily pass with 4-6 cores at 3.6GHz to 4.4GHz Intel i7.

The second PC could focus more on ECC memory and multiple cores being used with hyperthreading for a music workstation. Even with 2.2GHz, in a music station, I think the possibility of a 20-22 core system would be very sufficient.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #103 on: January 17, 2017, 08:53:03 pm »
I used PCPartPicker to create two separate builds: one based around a dual-CPU Xeon build for my musical workstation, and one based around a 4.2 GHz Intel i7 CPU for my video editing and encoding. I think it's quite possible that doing two separate builds for the different needs of this business would be much better for my budget than building one system that can do everything but is limited in certain areas.

The price comes out to around $9000 if I include taxes and the cost to build my customized case for everything.

I think this could be a good idea, but if someone disagrees, please point it out, and why you think one PC build would still be better.


Intel Xeon build...

https://pcpartpicker.com/list/wGkFYr

Intel i7 build...

https://pcpartpicker.com/list/MDJY2R


 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #104 on: January 17, 2017, 09:35:57 pm »
Thinking about it further, and doing continued research, one PC might be enough after all, if I really invest a lot into it. I have two more separate builds, with the only difference between them being the processor.

In that case, my last few posts can be completely ignored unless there would be something useful to add back there...

Question 1: In the more expensive build, would those two dual processors be worth the total price tag of $10,000-$11,000, over the $7,000-$8,000 build with lower-grade CPUs instead?

Question 2: Is the 1080 GPU really faster for my specific needs (as I am not focused on gaming speeds) than the 980 Ti or the Titan X? The price tag must mean something, at least between these, as there should be a difference between consumer and business parts in this price range.

I see that the memory bandwidth is higher on both the 980 Ti and the Titan X. Will that support my build's purpose better, or is it an insignificant difference?

Most expensive build... $10,000-$11,000 including taxes, and my customized case.

https://pcpartpicker.com/list/XVbqwV

Second most expensive build... $7,000-$8,000 including taxes, and my customized case.

https://pcpartpicker.com/list/wzWxVY
 

Offline DimitriP

  • Super Contributor
  • ***
  • Posts: 1307
  • Country: us
  • "Best practices" are best not practiced.© Dimitri
Re: Proofreading my computer design and building a server rack station.
« Reply #105 on: January 18, 2017, 12:06:14 am »
probably  :horse: but here it goes.

The first system has too many cores @ 3.0 GHz;
the second has even more cores @ 2.2 GHz.

As far as CPUs go, you need fewer cores and more GHz.
The right balance to me seems to be a four or six core CPU running at 3.6 GHz or higher.
Anything running @ 2.2 GHz will be dog slow unless you are running the one specific application task that knows what to do with that many cores.
10 cores @ 3.0 GHz are worse than 6 cores @ 3.6 GHz.
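(A rough Amdahl's-law sketch of this tradeoff, using a hypothetical parallel fraction rather than any real benchmark, shows when fewer fast cores win:)

```python
def effective_speed(cores: int, ghz: float, parallel_fraction: float) -> float:
    """Amdahl's-law throughput estimate, relative to one 1 GHz core."""
    serial = 1.0 - parallel_fraction
    # Unit workload: serial part on one core, parallel part split across all.
    time = (serial + parallel_fraction / cores) / ghz
    return 1.0 / time

# Hypothetical: a workload that is ~75% parallelisable.
p = 0.75
print(effective_speed(6, 3.6, p))   # -> ~9.6: six fast cores win here
print(effective_speed(10, 3.0, p))  # -> ~9.2
# At ~95% parallel (e.g. a well-threaded renderer), the ranking flips.
```

The crossover depends entirely on how parallel the actual applications are, which is the point of contention in this thread.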


all kinds of stuff to read here: 
old  https://www.pugetsystems.com/labs/articles/Adobe-Premiere-Pro-CS6-GPU-Acceleration-162/
and newer  https://www.pugetsystems.com/labs/articles/Premiere-Pro-2017-Intel-Core-i7-7700K-i5-7600K-Performance-884/

   If three 100  Ohm resistors are connected in parallel, and in series with a 200 Ohm resistor, how many resistors do you have? 
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #106 on: January 18, 2017, 12:44:20 am »
My specific applications actually do take advantage of multiple cores, especially the music programs such as Reaper 5 and Omnisphere 2, which is why I need more cores. The clock speed/GHz is mainly for rendering/encoding, so that's the balance I'm trying to strike here.

Is there a Xeon processor in the 2600 family (for dual-socket systems) that runs at 3.6GHz or higher, with at least 4 cores? I could at least push that up to eight cores with a dual-CPU system, but I believe the more cores I have, the better for the music production side of things.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #107 on: January 18, 2017, 06:46:27 am »
Quote
Here is a list of all E5 v4 processors that have 4+ cores and a 3.5GHz+ turbo boost frequency.
CPU part number is simplified, full part number is "E5-"+listed_part_number+"v4".
Core count, base frequency and maximum (single/dual core) turbo boost frequency are listed.

3637, 4C, 3.5GHz, 3.7GHz, $996
2643, 6C, 3.4GHz, 3.7GHz, $1552
2667, 8C, 3.2GHz, 3.6GHz, $2057
2689A, 8C, 3.4GHz, 3.6GHz, OEM ($1349 from China gray market)
2689, 10C, 3.1GHz, 3.8GHz, $2773
2687W, 12C, 3.0GHz, 3.5GHz, $2141
2690, 14C, 2.6GHz, 3.5GHz, $2090
2697A, 16C, 2.6GHz, 3.6GHz, $2891
2697, 18C, 2.3GHz, 3.6GHz, $2702
2673, 20C, 2.3GHz, 3.6GHz, OEM ($1050 from China gray market)
2698, 20C, 2.2GHz, 3.6GHz, $3226
2696, 22C, 2.2GHz, 3.7GHz, OEM ($1699 from China gray market)
2699, 22C, 2.2GHz, 3.6GHz, $4115
2699A, 22C, 2.4GHz, 3.6GHz, $4938
2699R, 22C, 2.2GHz, 3.6GHz, $4560

Thank you so much, that is an extremely helpful list! It really simplifies my choice, so thank you for taking the time to put it together for me. ;- )

Quote
I'm currently running an E5-2696v4; at the time I bought it, it cost me $1,929.

Does it run really fast for you? Theoretically, for me, that could be a very good compromise between my different workloads.

If I put 2 in there and received 36 cores, then my music production and 3D modeling/rendering would have plenty of cores to spread over. As for the video editing and rendering: if it were set to only utilize 2-4 of those cores, it would be capable of turbo-boosting (at least close) to its maximum 3.7GHz, do you think? Since I will also be aided by GPU acceleration from a high-end graphics card, more than about 6 cores would go unused there, so that high clock speed would become extremely useful in that workload.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #108 on: January 18, 2017, 07:35:01 am »
To get a complete database including features and limitations of all Intel CPUs, go to the Intel product specifications page. There you will find all you need to know about each CPU in a nice table format. I consult this database often, and it is the only reliable source for detailed information on Intel CPUs.
 

Online MK14

  • Super Contributor
  • ***
  • Posts: 4539
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #109 on: January 18, 2017, 09:11:09 am »
This (older than I'd like) article seems to carefully compare the difference(s) between normal CPUs, such as quad-cores, and higher core count, dual-processor setups, for audio work, using what they think are reliable benchmark(s) based on their many, many years of experience.

It seems the dual-CPU system wins hands down, performance-wise, for their audio stuff.

http://www.reyniersaudio.com/blog/recording-computer/recording-computer-cpu-benchmarks-sandy-bridge-nehalem-and-bulldozer-processors-compared

I DON'T know enough about your audio work (OP) and the articles to know how relevant it is.

Despite the age of the article, I suspect it's still relevant.
« Last Edit: January 18, 2017, 09:13:39 am by MK14 »
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #110 on: January 18, 2017, 09:54:45 am »
ARK only contains info about non-OEM chips, which are usually much more expensive than OEM parts.

What are you talking about? Every single CPU type has been there since the beginning of the database. It's not a webshop for non-OEM CPUs; it's meant for the user to find the appropriate CPU type for their particular needs. It even gives you info on all available packaging types, and it includes the possibility to compare CPUs side by side. The price you see there is just a reference price and has nothing to do with real prices, so it can be completely ignored. Most CPUs there don't even have a price.

What a load of BS!

P.S. Sorry for my rant, but I get furious when I see irrelevant posts like this.
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #111 on: January 18, 2017, 10:55:24 am »
ARK only contains info about non-OEM chips, which are usually much more expensive than OEM parts.

What are you talking about? Every single CPU type is there since the beginning of the database.


Hmm, I'm not sure if there is some misunderstanding here, but I agree with Blueskull, OEM-only SKUs are generally missing from ARK.

For example I run a pair of E5-2696 v2 on one of my dual Xeons, that processor's not on ARK but it appears on benchmark sites.

 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #112 on: January 18, 2017, 12:21:37 pm »
Most of those are "unknown" custom-built CPUs made to order and should not be considered by mere mortals. They are difficult to find, and you never know what you will get. No wonder people find so many CPUs that are incompatible with certain MBs.

Most of the CPUs listed above I can find on ARK anyway, and new Xeon versions take a long time to appear there. There are a lot of v4 parts that are coming but are not yet officially released, and a lot of CPUs that are not meant for the general public, so why even advertise such devices?

But now we are completely off topic, and this conversation does not help to build a good PC.

First we look at the usage scenario, then the software/hardware requirements, then at what is available, then at what is compatible with what, and after that we look at price and where we have to make compromises in regard to price/features. The rest of the blabber is just confusing and keeps us unfocused, ending up with a PC that will never be built, or with HUGE mistakes made.
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #113 on: January 18, 2017, 01:09:38 pm »
Most of those are "unknown" custom built CPUs ......

OK, so Blueskull was right then, OEM-only don't appear on ARK. Just sayin'.

Quote
But now we are completely off topic and this conversation does not help to build a good PC.

Hmm, convenient, the only person making an issue out of it to begin with was... ;-)
« Last Edit: January 18, 2017, 01:12:16 pm by Howardlong »
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #114 on: January 18, 2017, 03:46:56 pm »
My advice to any PC builder: unless you are a system builder and know your CPUs like your own pockets, don't buy a CPU you cannot find on ark.intel.com. There is a reason such CPUs are cheap and left off the list.

For instance, that E5-2696 v2 Xeon has only 11 cores, when the official count should be 12. That CPU is the same chip as the 2695 and 2697 but with a different configuration. God knows what other defects it might have shown when going through spec tests.

I'd rather not find out about the defects once they are already installed in the system, though I'm certain Intel made sure it would work before selling it.

We are trying to get a PC built that is meant for serious work, not some hobby PC where we rebuild the thing every now and then or experiment with different chips.
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #115 on: January 18, 2017, 04:10:37 pm »
For instance that E5-2696 v2 Xeon has only 11 cores, when the official count should be 12. That CPU is the same chip as 2695 and 2697 but with different configuration. God knows what other defects it might have contained when going through specs tests.

It's actually a defect in the motherboard's BIOS, on that Asus I mentioned where you suggested that people didn't know what they were doing (other than you, apparently). It won't boot with both CPUs installed and all cores enabled; it does when only 11 per socket are enabled. Individually, both processors work with all 12 cores. Just a friendly suggestion: please keep in mind when commenting that I have first-hand knowledge of these devices, whereas it appears you don't, and your comments are based on guesswork.

I actually agree with you in that I wouldn't recommend these devices to the OP, but that wasn't the point of my post with the screenshot of the device in CPU-Z, as you well know; it was to show that your comments about ARK were incorrect ("Every single CPU type is there"). You know that, but for some reason have chosen to take it out of context, apparently as a smoke screen for the fact that you might've got something wrong.

I also agree that we should get on with discussing the OP's concerns.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #116 on: January 18, 2017, 05:47:45 pm »
Howard, your CPU is still not supported by Asus. Don't go and blame the board, the manufacturer, or its BIOS for something that isn't even intended to work in the first place.

After 20 years and hundreds of PC builds, I can confidently say that each PC has worked flawlessly unless there has actually been a hardware defect (which isn't that uncommon). Going and experimenting with unsupported stuff when I want something that works well out of the box is the last thing I would ever do.

I have also built PCs that have run CPUs that are not even supposed to work at all on such hardware (chipsets or even CPU sockets). After a lot of resoldering and diagnostics, these experiments have also run flawlessly at speeds way above spec.

One example would be the Abit BP6 with 2 x 1GHz+ PIII CPUs. The i440BX chipset was actually designed for single 1st-gen Celeron CPUs with a 66MHz FSB; Abit made the board run with 2 Celerons @ the same FSB.
I made that same board run with 2 PIIIs @ a 133MHz FSB, and darn it was fast. That board has now been retired.

But I guess my experience is not enough, so I shut up now and leave this thread. Good luck with the build.
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #117 on: January 18, 2017, 05:59:17 pm »
After 20 years and hundreds of PC builds, I can confidently say that each PC has worked flawlessly

Good for you. Come make this Supermicro board advertise turbo speeds above base clock instead of below.

In reality, there are bugs. Everything has them. To pretend they don't exist because you aren't following instructions to the letter is ridiculous.
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #118 on: January 18, 2017, 06:30:15 pm »
Howard, your CPU is still not supported by Asus. Don't go and blame the board, manufacturer or it's BIOS for something that isn't even intended to work in the first place.

You've forgotten that I had similar difficulties on this board with supported non-OEM devices including pairs of kosher E5-2670s (and E5-2609s as a matter of fact).

Quote
After 20 years and hundreds of PC builds, I can confidently say that each PC has worked flawlessly unless there has actually been a hardware defect (which isn't that uncommon). Going and experimenting with unsupported stuff when I want something that works well out of the box is the last thing I would ever do.

Again, I am not recommending going the route of unsupported devices to the OP, how many times have I recommended only going for what's on the QVLs on this thread? You've missed the point completely, I was pointing out to you that your assertion regarding ARK and OEM-only devices was incorrect, and gave evidence of that.

Quote
But I guess my experience is not enough, so I shut up now and leave this thread. Good luck with the build.

In fact, your experience is appreciated, and I largely concur with it, but we all have to graciously accept that sometimes we are guilty of minor inaccuracies. It's not a problem as long as you're comfortable with accepting that.

 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #119 on: January 19, 2017, 01:08:12 am »
Quote
Yes, it's blazing fast for my needs. I run supercomputing on it, specifically running my home brew finite element simulation software on it for my research.
Since I use linked table data structure to represent sparse data, I won't get any benefit from GPU. Now I'm running my code on my Xeon E5, and I plan to build another Xeon Phi 7210 HPC for my expanded need.

Quote
It goes to 3.5GHz when 3 or 4 cores are used, and 3.7GHz only when 1 or 2 cores are used. When all 22 cores are used, it goes only to 2.8GHz (or 2.9GHz, I forgot the exact number), lower if AVX is used.

That sounds great in my opinion. Thank you for the information!

I think it should be compatible with my board? It's the Supermicro MBD X10-DAX E-ATX, which is compatible with the 2699 v3 model, and that seems to be mostly identical to the 2696 (I think)?

If that is the case, then why would I choose a different processor? I'm willing to put in a little work to get it running; I mean this to be a long-term investment in a good computer that will work precisely for my needs and last a while. Those specifications sound like the kind of processor I need.
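(As a rough aside, the turbo points quoted above can be turned into aggregate core-GHz figures, a crude throughput proxy that ignores memory and AVX effects:)

```python
# Turbo operating points quoted above for the E5-2696 v4
# (active cores -> approximate clock in GHz).
turbo = {2: 3.7, 4: 3.5, 22: 2.8}

# Aggregate core-GHz at each point: few fast cores suit lightly-threaded
# rendering steps, while all 22 slower cores suit wide parallel audio/3D work.
for cores, ghz in sorted(turbo.items()):
    print(f"{cores:2d} cores -> {cores * ghz:5.1f} core-GHz aggregate")
```

Even at the reduced all-core clock, the aggregate is several times what 2-4 turboing cores provide, which is why the chip can serve both workloads.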
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #120 on: January 19, 2017, 06:16:27 am »
+1.

I think I mentioned earlier making sure the BIOS is already updated. This happened to me recently on an Asrock X99 ITX board when fitting a 6800k; I had to pinch a 5820k out of another build just to update the BIOS.

Considering many of the server-level boards have a BMC, I'm not sure why they can't come up with a way of updating the BIOS through it.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #121 on: January 19, 2017, 06:34:29 am »
The Asus Flashback method can update the BIOS without any CPUs or RAM inserted. It only needs a BIOS binary, a power supply, and a USB thumb drive formatted to FAT32.
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #122 on: January 19, 2017, 06:42:13 am »
Asus Flashback method can update BIOS without CPUs and any RAM inserted. Only needs a BIOS binary, a powersupply and an USB thumbdrive formatted to FAT32.

Interesting, I will re-read up on that, I thought you still needed a POSTing system for Flashback to work.

Also, it looks like Supermicro have an OOB system called SUM to manage BIOS updates remotely, but it's not clear if it needs a POSTed machine to work.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #123 on: January 19, 2017, 07:11:49 am »
It's an Asus-exclusive feature on the latest boards; I don't think many others have it yet. With the thumb drive inserted in the correct USB slot, pressing the Flashback button while the computer is turned off automatically starts the update process, skipping POST.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #124 on: January 19, 2017, 07:14:43 am »
Some Asus boards also have a feature where you can copy a log of all status/diagnostics codes to a thumb drive. This can be done with the MB online or offline; it requires a different USB port and pressing another button on the MB.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #125 on: January 19, 2017, 08:34:05 am »
Quote
The particular MB supports up to 160W E5 v4 parts, but it requires a v2.0+ BIOS. If you are buying one, make sure to ask the seller to flash the latest BIOS (2.0b). If running a previous version of the BIOS, it may not boot, or may even fry the new CPU.
You can also buy the cheapest E5 v3, even an ES version, from eBay, and use it as your "bootstrapping" CPU to flash your new MB. A bonus point is that a spare CPU can be handy when diagnosing things.
When I was troubleshooting my system (AsRock X99 + E5 2696v3, not v4; I upgraded to v4 later), a BIOS defect made the system not boot. This was not fixed until the next BIOS release.
I live in Raleigh, NC, and what surprised me is that no nearby computer shops, including Best Buy, Intrex and Office Depot, carried any X99 or LGA2011-3 parts for diagnosis, not even for sale. So having your own diagnosis CPU for merely $100 isn't a bad deal, especially if you want to build more 2011-3 systems, for yourself or for a friend. If you don't need it anymore, just sell it; you can sell it at almost the same price. You can forget this advice if you live in a big city like NYC, where you can get diagnosis service for every platform easily.

Thank you again for the advice. I will make certain that is done. The last thing I want is to fry my $1000+ CPUs.

$100 seems like a great deal to me, so if for some reason the seller can't update the BIOS for me, I'll try that instead, it makes sense to me.

That was the board I was looking into, and I think it is a pretty good one, and it's a Supermicro board. Unless someone has a suggestion, based on their experience, of a board that would handle the CPUs better in a dual system?
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #126 on: January 19, 2017, 08:53:47 am »
Supermicro is the safest bet; it should run stable out of the box. You will not get some of the fancy features other boards have, but you don't really need those features anyway unless you are a computer enthusiast.

Someone with hands-on experience with that particular board should help you get things set up properly. It saves you a lot of headaches if there are some "gotchas" regarding CPU, RAM and GPU.

The best place to ask for help and further advice regarding a specific motherboard would be a forum where people talk specifically about that particular board. Some owners' thread would be great, as there you will find out what issues people are having and how to solve them.

It's hard work these days; 20 years ago it was simple, as the hardware was quite simple and a soldering iron could fix most issues. Usually whatever board you purchased worked out of the box with pretty much any components attached to it. Today hardware has become quite complex. ;D
 
The following users thanked this post: 3db

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #127 on: January 19, 2017, 09:06:46 am »
Thank you for your advice; that is what I was thinking in considering Supermicro. For the Supermicro board I posted, I read in an eBay listing that it wasn't compatible with the Xeon E5 2696 for some reason, but that's not true, right? The BIOS update should fix that, I think?

This is the only other board I've found, which had much better reviews than the D8...

https://www.amazon.com/Z10PE-D16-WS-LGA2011-v3-CrossFireX-Motherboard/dp/B00QC5DZEU/ref=cm_cr_arp_d_product_top?ie=UTF8

Although, what worries me here is that in two of the reviews I've read, E5 2699 v4's were not working with that board for some reason? That CPU should be nearly identical to the E5 2696, if I'm not mistaken?
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #128 on: January 19, 2017, 09:19:15 am »
According to ASUS, the E5-2699 v3 and v4 should work on the latest D16 board. It is important to select the right version of the CPU depending on the BIOS; a v3 is not the same as a v4. The v4 requires BIOS v3204 while the v3 runs on BIOS v0501. That's a HUGE jump in versions, so it seems there have been a lot of issues on this board too.

As I said before, Supermicro might be the safer bet, as their BIOS is more mature and is a lot simpler, without all the fancy stuff ASUS includes.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #129 on: January 19, 2017, 06:50:28 pm »
blueskull: While deciding between the boards that I'm looking at right now, I forgot to ask what version the CPU you're using is. Is it a v4, v3, or v2?

I still agree that supermicro might be a better choice, but I'm looking at the ASUS to see if it might have any features that I could possibly need down the line, or would be more useful, just in case.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #130 on: January 20, 2017, 02:06:19 am »
Thank you!

I'm starting to get a better look at what kind of build I will be making at this point. I will post my newly refined build with a link to PCpartpicker later on tonight, with a few more questions, mainly regarding cooling, power supplies, etc...
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #131 on: January 20, 2017, 08:43:35 am »
This is my current build...

https://pcpartpicker.com/list/kjywCy

You can replace the dual Xeons with the E5 2696 v4, as I think that is what I am going to get.

I think I will go with that board, but I'm also looking at the ASUS 16 board as well.

I'm not sure about the coolers I picked? They are liquid coolers, but I don't know if that is a good idea, or if I should go for normal fans. I have two in mind for the CPUs, but I'm not sure if I should be getting more for additional cooling for the graphics card, or anything else in general?

I have some additional questions about the power consumption, but after I finish looking it up myself, I will bring it back here.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #132 on: January 20, 2017, 09:22:54 am »
I'm a bit worried about the dual graphics cards and the PCIe SSD for this build on this particular MB. It seems everything, including the chipset, will be connected to CPU1; I'm not sure that is a good idea. CPU2 will be mostly doing nothing when it comes to managing devices, and all the CPU PCIe lanes on CPU1 will be occupied by pretty much everything. I don't think you will get even close to max performance from the GPUs in this configuration.

Another thing I noticed is that the RAM is in the way of longer cards in the TOP PCIe slot. That could be an issue.

Maybe someone else has some additional insights to this.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #133 on: January 20, 2017, 07:12:59 pm »
1. I will look into faster RAM, although, from what I have heard, especially in my case, that speed increase isn't too much of an improvement.
2. and 3. The chassis does not matter; I am building my own tower/case for this build completely from scratch. That first post is completely outdated, and there is nothing relevant left in there, so it is not something to be concerned over.

About the power supply questions: I'm still looking into it, but this PC is going to consume 1000+ watts of power, and with the rest of my setup, that could trip breakers. I was thinking about installing another circuit breaker in my house for it to run on, and I'm having a hard time finding pricing estimates (at least current ones). I had done this before, but that was years and years ago.
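(For a back-of-envelope check, here is a sketch of the breaker math, assuming US 120 V mains, a 15 A branch circuit, the common 80% continuous-load rule, and hypothetical wattages for the rest of the setup:)

```python
# Rough current draw on a US 120 V branch circuit (hypothetical loads).
VOLTS = 120.0
BREAKER_A = 15.0
CONTINUOUS_DERATE = 0.8  # common 80% rule of thumb for continuous loads

loads_w = {"PC": 1000, "monitors": 150, "receiver/amp": 300}
total_w = sum(loads_w.values())
amps = total_w / VOLTS
budget_a = BREAKER_A * CONTINUOUS_DERATE
print(f"{total_w} W -> {amps:.1f} A (continuous budget {budget_a:.0f} A)")
```

With these made-up numbers the draw lands right at the 12 A continuous budget of a 15 A circuit, which is why a dedicated circuit for the rig is a reasonable idea; the real figure depends on the actual loads.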
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #134 on: January 20, 2017, 09:56:20 pm »
Quote
I'm a bit worried about the dual graphics cards and the PCIe SSD for this build on this particular MB. It seems everything, including the chipset, will be connected to CPU1; I'm not sure that is a good idea. CPU2 will be mostly doing nothing when it comes to managing devices, and all the CPU PCIe lanes on CPU1 will be occupied by pretty much everything. I don't think you will get even close to max performance from the GPUs in this configuration.

Another thing I noticed is that the RAM is in the way of longer cards in the TOP PCIe slot. That could be an issue.

Maybe someone else has some additional insights to this.

The more I have been thinking about it (although it might be against some users' recommendations), I think I might go with the ASUS board. I have heard a lot of good things about them (Linus uses them), and I think it might be a good overall choice.

Would I still face the same problem with that one? I am somewhat confused.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #135 on: January 21, 2017, 05:33:03 am »
Quote
Both boards use all 80 PCIe lanes from the two CPUs; each E5 v3/v4 can only provide 40 lanes.
The Supermicro board has 10 slots of x8 lanes, plus an x4 link derived from the C612 chipset.
The Asus board has 4 slots of x16 lanes and 2 of x8 lanes, all from the CPUs. In addition, it provides an x2 M.2 slot from the C612.

I was reading up on PCI-E/lanes/GPU, and I think I understand it somewhat better. I'm still somewhat confused, and I want to make sure I get this all right.

Nothing should be incompatible, and for example's sake, I will be going with the ASUS board. With that in mind, can somebody explain to me in detail exactly how this works, and the issues that I am facing with the graphics cards, lanes, and the CPU?
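(The lane counts quoted above can be sanity-checked with a quick budget, using the figures from that quote:)

```python
# Lane budget for a dual E5 v3/v4 board: 40 PCIe 3.0 lanes per CPU.
cpu_lanes = 2 * 40

# Per the quote: Supermicro exposes ten x8 slots; Asus four x16 plus two x8.
supermicro = 10 * 8
asus = 4 * 16 + 2 * 8

assert supermicro == cpu_lanes
assert asus == cpu_lanes
# Two GPUs at x16 need 32 lanes -- well within one CPU's 40, but only if the
# board actually wires both x16 slots to that same CPU.
print("budget OK:", supermicro, asus)
```

The budget alone doesn't settle the GPU question; what matters is which CPU each slot is physically wired to, which is the issue being discussed here.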
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #136 on: January 21, 2017, 06:16:11 am »
Puget Systems did a reasonable range of tests on x8 vs x16. https://www.pugetsystems.com/labs/articles/Titan-X-Performance-PCI-E-3-0-x8-vs-x16-851/

One thing to consider is the logistics. For example, the Asus X9PE-D8 WS board I use has four x16 slots (2 per CPU) and supports up to four two-slot GPUs. However, you can't get a GPU with a back-plate heat spreader into the first slot because the DIMM slot tabs hit the card. You could hack the DIMM slots, but it's a design fail. Things like this, regrettably, you only find out at the build stage, unless you can find someone else who has also done your build and remembered to document the feature.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #137 on: January 21, 2017, 08:12:26 am »
NO, you don't need all GPUs on one CPU; Asus supports 4-way SLI across the two CPUs' PCIe lanes.
NO, you should not try to force a 2-way SLI onto CPU1 if there is a lot of other hardware attached to that CPU while CPU2 has nothing else to do but pure computation. This was shown by LinusTech in one of his videos where he had built a video-rendering beast (I can't find that particular one right now). The end result was that when the GPUs were connected to CPU1 it was slow as hell, and after switching to CPU2 the speed improved significantly.

But don't take my word on this; go find the relevant videos and benchmarks yourself.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #138 on: January 21, 2017, 08:58:42 am »
Thank you both for explaining this further, it is very helpful, I appreciate it!

I'm looking for the Linus video right now. I will be back after I have read the above article and I have seen the videos.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #139 on: January 21, 2017, 09:32:09 am »
Try to find the follow-up on this PC build where he realizes his mistake; I'll see if I can find it too.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #140 on: January 21, 2017, 10:38:08 am »
One additional note about having GPUs connected to different CPUs: though it works, there will be a performance penalty, because the cards don't share the same address space. An SLI link should solve this at least a bit and provides additional bandwidth for GPU-to-GPU communication.

I read an article where some company builds servers with 8 GPUs installed. The article explained quite well the relationship between PCIe lanes, CPUs, the PCIe root complex and switching, and how it all affects performance. I can't find the exact article right now, as I just stumbled on it while looking at other stuff.

 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #141 on: January 21, 2017, 10:43:10 am »
Try to find the followup on this PC build where he realizes his mistake, I try to see if I can find it too.

Indeed, I watched that series a while ago. It also is not clear which board he ended up using. He started off with a Supermicro server board, then switched to an Asus. Both he had enormous difficulties with getting them to work with dual graphics cards as I remember, despite being in the phone to their techs for a very long time (a service that mere mortals won't get). Then I believe Supermicro sent him a workstation board where he did get the two GPUs working, although it's far from clear. One chassis we see is an enormous 6U+ rack-mount rig with 8 GPUs and dozens of SSDs but I don't know what that was based on.

Seeing him struggle with those boards was actually very worthwhile, because that is exactly the sort of nonsense everyone else has to put up with when doing a custom build with these lower-volume, less-characterised high-end boards. We just don't have a hotline to the motherboard vendor.

I go hot and cold on Linus. He's young and enthusiastic, but he's also quite naive. The recorded videos are generally pretty slick, but often lack depth; I assume he's trying to keep to the infamous ten-minute attention span. A lot of what he does technically has already been figured out by one of his team of minions, although he rarely mentions that, so it often passes as his own work. Sometimes on camera, on the live builds for example, he displays quite a nasty side in the way he talks to his coworkers when they are trying to fix something in the background.
« Last Edit: January 21, 2017, 10:47:19 am by Howardlong »
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #142 on: January 21, 2017, 05:01:47 pm »
Oh no, I see now that I caused confusion by mistake. Yes, there is no known way today to enable more than 4-way SLI. And yes, in CUDA/OpenCL we think more in terms of compute modules and cores, so in theory it could scale infinitely. From the user's perspective, all SLI does is make 2+ cards look as if they were one big GPU.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #143 on: January 21, 2017, 07:49:45 pm »
I watched the video and the follow up video. I think I have figured out what I should know about PCIe lanes, and I see what everyone is saying.

Now, if each CPU runs 40 lanes, would it not be logical that I could run both at x16, even with both cards in SLI, on one CPU?
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #144 on: January 21, 2017, 08:01:45 pm »
Can you point me to some documented benchmarks on running, say, two GPUs across two CPUs as opposed to running both GPUs on a single CPU?

It's not a leading question, I am genuinely interested.

My only direct practical experience of this is on SQL Server dealing with NUMA under VMWare (these days almost nothing's on physical) and to be honest in that respect I rarely have the opportunity to do real world performance analysis other than on my own setups and on live systems, and then dealing with the troubleshooting and consequences of mal-configured VMs. Unlike a decade+ ago, with the increasing use of SANs and VMs, customers rarely replicate their production configs on lower environments, making performance analysis far less deterministic.
« Last Edit: January 21, 2017, 08:08:03 pm by Howardlong »
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #145 on: January 22, 2017, 01:13:13 am »
This was the closest thing I found regarding benchmarking of a 2-CPU/2-GPU setup, but I don't think it's exactly what you were talking about...

http://www.ks.uiuc.edu/Research/namd/mailing_list/namd-l.2011-2012/2577.html

I'm perfectly okay with having to run them in whatever configuration I have to. I might just need the concept of the lanes explained to me further. I know the ASUS board has more than one x16 slot, and each CPU supports 40 lanes, so to me (with not too much understanding of the concept, I suppose) it sounds like the 40 lanes are enough to run them both at x16 on only one CPU, correct?
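For what it's worth, the lane arithmetic being asked about can be sketched like this (the device list and link widths below are illustrative assumptions, not taken from any particular board's manual):

```python
# Hypothetical lane budget for a single 40-lane CPU.
CPU_LANES = 40

# Illustrative devices and the link widths they'd want (assumptions).
devices = {
    "GPU 1": 16,
    "GPU 2": 16,
    "NVMe SSD": 4,
}

used = sum(devices.values())
print(f"Lanes used: {used}/{CPU_LANES}")
if used <= CPU_LANES:
    print("Both GPUs can run at x16 from one CPU")
else:
    print("Needs a PCIe switch or the second CPU's lanes")
```

Under these assumptions, 16 + 16 + 4 = 36 lanes fit within one 40-lane CPU, which is the intuition behind the question; whether a given board actually wires two x16 slots to the same CPU is a separate matter of its block diagram.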
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #146 on: January 22, 2017, 06:38:39 am »
I don't have any dual-GPU machines here, as I have little need for them in my work, but I do have two identical GTX 1070 cards that I could do some benchmarks with to settle this. It would mean taking a tight ITX X99 build apart, though; the 1070 needed some case modding to fit, so taking it out again will take some time.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #147 on: January 22, 2017, 08:50:37 am »
Howard, I would really appreciate that, it would help out a lot. If it's too much trouble though, you don't have to worry about it. Obviously, computer builds are no simple task, and I don't want anyone putting themselves or their equipment through too much trouble.

I'll be back after a little while, looking into some more information on the CPU/GPU, but again, I wish to thank everyone here for taking so much time out in helping me with this build. It is really appreciated! ;- )
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #148 on: January 22, 2017, 10:03:48 am »
Not everything needs to run through the CPU. Most older i7 CPUs have only 16 lanes in total. This is where the root complex and switching come into play. You can have, for instance, 32 lanes on a switch chip, and if the only things it controls are two x16 PCIe ports, it will switch a full 16 lanes of data from GPU to GPU without having to bother the CPU. This is just an example. Sometimes, when building more complex systems with very specific requirements, it can be a good idea to check out how the CPUs, the chipsets, and all the ports/plugs are wired together.

In your case, you are building a fast general-purpose computer, so winning (best performance) on one end might make you lose on the other. But it should be possible to find a good balance between the two extremes.

That Asus board can't be that bad, or Asus would go out of business pretty soon. If you decide to go for Asus, then make sure you also pick other parts from the Asus QVL as much as possible. If something for some unknown reason goes wrong, you can blame Asus and get your money back, especially if the parts were Asus's own QVL parts. Just a hint. :-)
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #149 on: January 22, 2017, 01:16:49 pm »
I've been thinking about this computer build. I've been studying a lot of sites that have recommendations for professional audio production, and separately for professional video production.

All the sites state that no more than 64GB of RAM will ever be needed for any task in either field. Most even stick with only 16GB, though in some cases 32GB is required when doing some really heavy editing. If you are going to do video editing at a level similar to Pixar, then you will need Quadro cards for the added video memory, but I'd say 8GB per GPU is more than enough. Most of the recommended builds had a price tag well below $2000. Of course, if you want to use SLI, then for each GPU card added you can add almost $1000 more to the budget.

For audio, more cores are better, but nowhere did I find any recommendations for tens of cores. Heck, most users did professional audio production on laptops, which are slow compared to desktops.

I'm starting to think this dual-Xeon build is total overkill and a waste of money. It also introduces a lot of complexity to the system, which in turn can make it difficult to get a stable and reliable build; in the worst-case scenario it will perform far worse than a workstation with just an i7 CPU, 32GB RAM, an SSD, and a couple of GPUs.

For a workstation, this server type of mentality is not required. But if you want to build a rendering server that runs many rendering/encoding jobs at the same time, then we can talk server; even then I'd keep the actual workstation separate and somewhat simple. The server could also do other tasks, like hosting a database or working as a NAS and storage for backups and archives; then we'd need more cores and more RAM. Until then, I think it's pointless.

But in the end you pay so you decide what is best for you.
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #150 on: January 22, 2017, 02:48:56 pm »
FWIW, here are some benchmarks. Please note that without access to the software and workflow that will actually be used, these are simply indicative. Also note that this was about 2.5 hours of effort, including accumulating the parts, so while I'm happy to try other specific tests, it is reasonably time-consuming ripping boards in and out and getting the machine to boot again, plus taking care that the right slots are used.

In all cases, I was using x16 slots that were working electrically at x16.

I didn't have an SLI HB bridge to hand so these tests are using a standard SLI bridge.

Machine is Z9PE-D8 WS with 2 x Xeon E5-2696v2 ES QDUF Bclocked to 105MHz, 2.94GHz turbo to 3.15GHz (restricted to 11 cores/socket), 64GB (8x8GB) DDR3 ECC RAM @ 1680MHz, 2 x MSI GTX 1070 Gaming, Corsair HX1000i PSU.

Code: [Select]
GPUs                      1xGPU    2xGPU     2xGPU     2xGPU     2xGPU
SLI                       N/A      Off       On        Off       On
PCIe x16 bus              N/A      Same CPU  Same CPU  Diff CPU  Diff CPU

Unigine Heaven  fps       41.7     41.1      73.4      41.2      52.5
Cinebench R15 OpenGL fps  89.04    88.95     90.68     91.28     89.04

Cinebench results strongly suggest it only uses one GPU.

Unigine Heaven has two interesting scores when SLI is on: 73.4 fps when both cards are on the same CPU, but only 52.5 fps when the cards are SLI'd across two CPUs. Again, please note that this is not necessarily representative of your workload; it is probably more relevant to gamers.
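As a quick follow-on, the scaling factors implied by those Heaven numbers work out as follows (this is just arithmetic on the table above, nothing more):

```python
# Unigine Heaven fps from the benchmark table above.
single = 41.7        # 1 GPU
sli_same_cpu = 73.4  # 2 GPUs, SLI on, same CPU's lanes
sli_diff_cpu = 52.5  # 2 GPUs, SLI on, split across both CPUs

print(f"SLI, same CPU:   {sli_same_cpu / single:.2f}x scaling")  # 1.76x
print(f"SLI, split CPUs: {sli_diff_cpu / single:.2f}x scaling")  # 1.26x
```

So the second card buys roughly 1.76x on one CPU's lanes, but only about 1.26x when the pair straddles the two sockets.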
 
The following users thanked this post: MK14, Lizzie_Jo_Computers_11

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #151 on: January 22, 2017, 02:53:42 pm »
I'm starting to think this Dual Xeon build is a total overkill and a waste of money and it also introduces a lot of complexity to the system which in turn can make it difficult to get a stable and reliable computer build which in worst case scenario will perform way worse than a workstation with just an i7 CPU, 32GB RAM, SSD and a couple of GPUs.

I have to agree, if it were just one job then it would be easier to come up with an optimised solution, but there are several somewhat conflicting requirements which means it'll have to be a compromise one way or another.

My only caveat to that is that the benefit of having a gazillion cores is that you can get on with other stuff without being hindered while a couple of renders are going on.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #152 on: January 22, 2017, 07:57:56 pm »
Howardlong: Thank you for all of the effort you put into that! I really appreciate it, and I know it must have been a lot of work on your part, and time consuming. That is more than enough, and definitely worth it, so thank you again.

It seemed 2x GPU/SLI on/same CPU had the overall best results, with the fps being significantly higher in Unigine (from about 41 to 73), but the fps never rose significantly in Cinebench; in fact, when dual GPU was first enabled, it actually dropped.

The results didn't show that big a performance increase in fps alone, but I believe there would be even greater improvement when video editing, specifically when scrubbing the timeline and previewing edits. Adobe Premiere can make great use of CUDA cores, so that is a given.

It should bring an even bigger impact when rendering in 3D programs like Blender, which relies primarily on GPU acceleration, and in Unreal Engine 4, which will benefit from more GPU/CUDA cores. That makes your results helpful either way, especially when it comes to this particular task.

slicendice: I actually do agree with your point (and Howard's as well) about the difficulty of balancing the different tasks at hand, which is why I earlier brought up the concept of different builds all in one massive tower, as opposed to one general-purpose machine. I decided a single machine might be easier, but if it's going to greatly impact everything, perhaps we should build several instead; one for each purpose?

slicendice, you're definitely right about the RAM/cores on the video editing machine leveling out at 64GB and giving less return beyond around 6 cores. Clock speed, it seems, would be most important.

Howardlong, I agree that a lot of what I was thinking with all of the cores was multitasking while renders are going on.

However, with separate builds, it wouldn't make a difference, I suppose.

What do you think? Would separate builds provide an overall better experience with each set of tasks at hand? I think they would.

As a quick layout (although, after hearing everyone's opinion on separate builds, I will be far more detailed in this description), I think this is what I could be looking at:

A video editing computer.
A rendering server-like computer.
A general purpose (though less intricate than what I have now) dual Xeon system.

One thing I have to look into is whether CAD programs share enough similarities with either a rendering server or a video editing computer that I can figure out which of the two builds can also cover their professional-grade specifications.

Thank you again everyone!
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #153 on: January 23, 2017, 08:23:32 pm »
From research, I have discovered that both video editing workstations and CAD workstations call for essentially the same kind of specifications.

Although, in a video editing computer, the CPU is the most important piece.

In a CAD workstation, the GPU is the most important piece.

I think I might be looking at 2-3 separate computer builds?

I'll start with the video editing build (also, it's likely the CAD build).

It seems that in a video editing environment, a high CPU clock speed is more important than many cores. The cores are useful to a point, but beyond 8 cores there is usually a drop-off in returns. With that in mind, I have been looking at the i7 6900K and the i7 6950X. They clock beyond 3.7 GHz with as many cores as I'll ever need for video editing and scrubbing the timeline.

For graphics cards, two GTX 1080s would probably still work, or to save a little I could work with only one of them. As I said earlier, Adobe Premiere utilizes CUDA cores, and compared to a Quadro M4000, the 1080 has nearly twice as many.

I've been looking into SSDs for the OS and program files, and I think the Samsung 950 Pro really is the best choice. The motherboard has an M.2 slot on it.

As for a motherboard, I think the Asus X99-Deluxe would be a good choice, but I'm still looking into other options also.

As for RAM, no more than 64GB of DDR4 will do me any good. 2400MHz also seems like a safe bet, though I don't think I even need it to be that quick, as faster RAM starts giving diminishing returns.

I think that it's an overall good build for video editing.

In summary, this computer's main objective is...

Video editing in Adobe Premiere CS6 and Adobe After Effects CS6.
Specifically being able to move across the timeline quickly, and preview real-time video edits.

Secondarily...

Be used as a CAD workstation meant for 3D modeling, doing work similar to what you would see in today's high-end videogames.
Again, being able to preview in real-time.
Development on Unreal Engine 4, with the same above requirements.

I do believe that the video editing, and the CAD workstation will be able to share a build without sacrificing on anything on either side, but if you see that there will have to be compromises, I can begin another build for the CAD instead.



The next computer is intended to specifically be a rendering computer. I would like to set up a network for this all to work on, but I have run into some problems, which I will explain, before going into the actual build.

The Adobe Premiere and Adobe After Effects programs don't seem to support that kind of network rendering. I'm still looking into it, but I believe they don't have built-in features for working across a render server.

I'm not sure about the workarounds, but they need to work faster than everything running on one machine would.

As for the actual build.

I know that a lot of cores is good for video rendering, so I suppose a dual-Xeon build would be a good idea here; those 22-core CPUs, I suppose.

The graphics card is secondary? I don't think it will assist with rendering all that much, but this is where the second GTX 1080 could be used instead of doubling up on the editing build.

RAM is next, so I suppose between 32GB and 64GB will be more than enough. The same as the above build.

The same storage as above. The Samsung.

I would need a Dual Xeon Motherboard, and I suppose the earlier mentioned Z10PE D16 WS board will work.

I'm not too familiar with building servers, or a rendering system, but earlier, the two of you mentioned them, so I think you can help me.

In summary, this computer's main objective is...


Rendering video projects, and Unreal Engine 4 projects quickly and efficiently.
The purpose is to keep the editing computer separate from the rendering computer. With this setup, a project can finish the editing stage, be sent to the rendering computer, and undergo the rendering process, while editing continues with no slowdown on another project on the editing computer.

As a last note, I'm assuming it actually isn't a good idea for the server computer and the rendering computer to be one and the same?



As for the last build, that takes me back to the Dual Xeon build we were working on.

This computer will be a general purpose computer for everything else, including music production, email, internet, etc.

The ability to have multiple processes running at once is the basic idea.

The E5 Xeon 2696/2699 v4 CPU will work for this purpose, even despite its low base clock speed, as its main purpose is multi-core usage. I could work with just one and upgrade to two of them later on.

The motherboard could be another ASUS Z10PE D16 WS, which I have already established is a working board for this build.

For RAM, even 32GB of DDR4 should be more than enough.

The same Samsung storage again.

I could add a sound card, dedicated to the DAW programs.

Lastly, I could throw in another GTX 1080, as it will be more than enough for its general-purpose usage.

In summary, the objective of this computer is...

Working as a general purpose machine, for anything outside of video editing/rendering/UE4/server to run through.
Allowing the other builds to work independently without slowing anything down.
Music production, that will work quickly with Reaper 5 (DAW), and VSTs like Omnisphere 2.



In hindsight, I could be looking at up to four builds, but that is where I need everyone else's help.

I will keep researching, but all of your input would be appreciated in figuring out how many builds and how to give them each their own set of skills accurately.

Thank you, again!
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #154 on: January 24, 2017, 09:43:16 am »
Quadro cards are not faster than GeForce cards; that is a myth. The only two reasons to buy Quadro cards are avoiding rendering artifacts and support for more GDDR memory. In fact, GeForce 1080 cards are a lot faster than the Quadro M6000 and P6000, but the Quadros have 24GB of GDDR while the GeForce has only 8GB. And not to forget GeForce in 2-way SLI, which boosts performance even further.

For Unreal 4, if you are creating your own assets, then you will need software like 3D Studio MAX or Maya, and the same GPU specs apply as for the other CAD software. For the actual programming, any device that can run the game itself will be good enough.

For video production, this same build can be used.

For rendering 3D assets and videos, a second computer could be great. That could actually be a server where you virtualize things like AD, SQL Server, source control including automated builds, a rendering engine, etc. BUT BE WARNED! HERE IS WHERE THE PRICE TAG WILL GO UP A LOT! You will pay a lot for licences: server licences for each CPU core, CALs for the servers, and CALs for pretty much every piece of software you add there. You will most likely pay at least as much in licence fees as the actual hardware build costs.


I would really start with the workstation computer first and make that one work well; after a while, if the workflow has bottlenecks because of the single computer, I'd go with a server as step 2.

I have an ancient laptop and an even more ancient desktop computer. I use the laptop for programming, and I have even tested how well it works with Unreal and Unity. No 4K games, but for FHD it's fast enough. I have tried all the Autodesk products on it, and though the performance isn't that good, it's still good enough. The laptop has only 8GB RAM, a SATA3 SSD, an i5-2410M CPU, and a GF GT-540M card. If you compare those specs to the components going into your build, yours are 10-1000x faster each, depending on which component we are looking at. So please don't try to overkill this build. ;-)

The server part handles AD, SQL, source control, and storage. It has only 4GB RAM, 7200RPM HDDs, and an ancient 64-bit 4-core Pentium-era CPU which does not even support virtualization. Everything is set up on the main OS (Server 2012 R2), which is actually a bad idea. :-) It all works well and is fast enough for serving only me. But I'd prefer to have support for virtualization, so I could use Hyper-V Server as the host and then virtualize the AD, SQL, source control, and NAS into separate VMs. I would need 8 cores and 16GB RAM for snappy performance. That's it! So again, please don't overkill the server build part. :-)

As for a third computer, you don't need it for anything other than backup storage, so keep it light. :-)

Regarding GPU rendering vs CPU rendering: if it's true that Adobe software produces worse-quality video on the GPU than on the CPU, then switch to some freeware rendering engine that does the job hundreds of times faster with lossless quality on GPU alone. Adobe software used to really suck at this. Trust me, it was terrible. I've tried it, tossed it in the bin, and never looked back. I doubt it has improved that much. :-DD
« Last Edit: January 24, 2017, 09:53:17 am by slicendice »
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #155 on: January 24, 2017, 11:13:40 am »
Quote
6. If you insist on buying a single socket system now, or you are a big Intel fan (which I am), then I recommend you to buy a 6800k because it has the same spec, just with 28 lanes of PCIe, not 40 lanes. For a dual GPU configuration, 16+8 is more than enough. The rest 4 lanes is for NVMe SSD.

FWIW, if you intend to overclock, my experience is that the 6800K doesn't do very well at all compared to the 5820K, its previous incarnation (both are LGA2011-v3). All other things being equal, on the examples I have, the 6800K I can get stable at 4.2GHz with a rather high 1.45V Vcore, whereas the 5820K achieves 4.6GHz with little effort, and 4.7GHz with a bit of minor Vcore tweaking. The problem with overclocking the 6800K doesn't appear to be temperature related, as it'll fail at reasonably low temps (~60C on liquid cooling).

If you're not overclocking, ignore this comment, as at stock speeds the 6800k will outperform the 5820k.
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #156 on: January 24, 2017, 11:41:17 am »
I now have an HB SLI bridge, so I can re-run a couple of tests, although I'm not expecting much improvement. Frustratingly, it's a 60mm (3-slot) bridge, which means that on the board I'm using one of the two GPUs will have to be in an x8 PCIe slot (it's x8 and x16 on alternate slots), so I'll have to get a baseline in the x8/x16 configuration too. Hopefully the baseline will be similar.

Anyway, if there are some specific simple benchmarks I should run, let me know and I'll include them.

FWIW, here is my setup (pic is with a standard SLI bridge). I have used air cooling for this because Xeons tend to be easier to cool than overclocked processors, and, perhaps more importantly, I have more faith in air cooling than I do in liquid cooling in terms of reliability. The processors run at full load at 60C. The disk FWIW is a Samsung 950 Pro NVMe M.2 mounted on an x4 PCIe board on the second CPU.

Edit: the Unigine Heaven tests were run at 3840 x 2160 resolution.

« Last Edit: January 24, 2017, 12:22:17 pm by Howardlong »
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #157 on: January 24, 2017, 07:54:57 pm »
Blueskull/slicendice: I watched this Linus video a little while ago...



The Quadro M6000 was indeed outperformed by the Titan X, and even further by 2x Titan X. He was going to do three, but then tripped the power supply on his build.

I was going to ask what possible benefit the Quadro would have over the Titan X or in my case (unless it would do worse) the 1080.

Running the specifications of the M4000 against the GTX 1080, the 1080 won out in basically all of its on-paper specifications, which lines up with Linus' tests.

With that in mind, would getting a 1080, or if I really needed to, a Titan X really degrade the performance of the CAD operations? I intend to use, possibly, Blender/Maya for all of my 3D modeling, so with only a GTX 1080 will my model manipulation really slow down significantly?

As for the video editing, will getting an M4000 negatively affect the editing experience, slowing down timeline scrubbing or previewing, or would it improve them?

In the Linus video above, it would appear that even the GTX 1080 could outperform the M4000.


slicendice: As far as the NAS is concerned, would building a SAN instead cost less? Really, what I need is an editing computer and a rendering computer that both have access to the same source files.

If I build up an editing computer and a rendering computer, and run them through a home network system to a third "storage" computer, then they will both have access to the same files/projects/assets. I just need to run it all on a fast network.

I assume I could go with the bare minimum on a storage computer that the editing/rendering computers would access, right? A server board would be enough, or a board where I can (as time goes on) continue to add more and more SSDs/HDDs as storage that the editing/rendering computers will access files from. I don't think I really need a GPU, and would more than a basic CPU and a little RAM make any difference? It's only meant for serving files.


Now, in Adobe Premiere, the CPU is most definitely the key. However, Premiere does use GPU acceleration on certain effects and transitions, as well as using the GPU to render those effects. In After Effects, the GPU can be utilized further than in Premiere, since it is a graphics-based program. I assume I will be using After Effects a lot as well.

I think it looks like this...

CPU>GPU: Adobe Premiere Pro CS6
CPU=GPU: Adobe After Effects CS6

blueskull: Are there any other documented cases of the Samsung drives being prone to failure? I was originally going to go for the Intel 750 series because of bad past experience with Samsung products, but the 950 seemed to be the fastest possible, so I changed my mind. I want my SSD to survive longer than a year.

Howardlong: I probably will be overclocking, so thank you for that information. I was indeed hoping to get a (stable) 4+GHz out of my CPU, as that would really boost my performance, especially on the editing machine. I will look into that CPU.

The only additional benchmark that I think I need would be a render time, but if that is too much, you don't have to worry about it.



What I think I'm looking at again, in quick summary... (unless it isn't looking good, let me know)


An editing computer... secondarily a CAD/3D modeling computer.

It needs to...

Move around the timeline quickly/seamlessly.
Real time previewing quickly/seamlessly.
Manipulate 3D models quickly/seamlessly.


A storage computer...

It needs to...

Store all Adobe Premiere Pro/After Effects files/projects for access on the editing computer and then the rendering computer.
Access to these files must be quick; no slower than on a single machine.


A rendering computer...

It needs to...

Render and encode high quality video files (between 1080p and 4K) quickly and seamlessly.
Be able to render/export high quality 3D models for Unreal Engine 4 quickly.
Be able to export Unreal Engine 4 completed projects.


A general purpose/music production/multi tasking computer...

This one I pretty much have figured out, it will wind up being the build we were putting together earlier with the ASUS board and the Dual Xeon E5-2696/2699 v4s.

Thank you again, for continuing to help me with these builds.



Does anyone think I should start a fresh topic on this, since the plans have changed from what they were in the beginning?
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #158 on: January 24, 2017, 09:54:14 pm »
If you can let me have the details of an easily reproducible render test I can run here, I'd be happy to run it.

Some interesting scores on Unigine Heaven this evening (all tests on a single CPU's PCIe lanes)...

SLI standard bridge x16+x16: 73.2 fps
SLI standard bridge x16+x8: 71.5 fps
SLI high bandwidth bridge x16+x8: 71.2 fps

So for this test, the $40 HB bridge functions better as a paperweight. I looked at the X-rays of these HB bridges last night. Although they have some additional thought put into the PCB layout, their primary function is as a passive device. A nice way to make a few bucks, and the way they're constructed and presented in the packaging almost puts them into Audiophool territory.

I have a 40mm SLI HB bridge coming my way tomorrow to allow an x16+x16 HB SLI test, but I can't help but feel I've been shafted already!
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #159 on: January 24, 2017, 11:11:47 pm »
Here are two rendering tests that I've found...

https://render.otoy.com/octanebench/

https://corona-renderer.com/benchmark/

Those are indeed interesting results. The x16+x16 did the best, even better than the HB bridge showed at x16+x8.

As for your HB 16x16 on the way, is it possible that the rest of your setup might not be taking full advantage of your HB hardware? Money going to waste is a terrible thing...
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #160 on: January 24, 2017, 11:37:27 pm »
For your storage solution, if you want it to be really fast, you need a bunch of 10Gbit network cards and a switch with matching ports that supports trunking. 2GB/s should be enough, right? Those devices cost a lot. I think in this case the bottleneck will become the rendering speed, not the transfer speed.  :-DD
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #161 on: January 24, 2017, 11:49:56 pm »
For your storage solutions, if you want it to be really fast, you need a bunch of 10Gbit Network Cards  and a Switch with similar plugs that supports trunking. 2GB/s should be enough, right? Those devices cost a lot. I think in this case the bottle neck will become the rendering speed, not the transfer speed.  :-DD

A single stream will only achieve 10Gbps.
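For scale, here is a rough wire-time sketch of what a single 10Gbps stream means for moving footage around (the 400GB project size is just an illustrative figure, and protocol/disk overhead is ignored):

```python
def transfer_seconds(size_bytes: float, link_bps: float) -> float:
    """Ideal wire time to move size_bytes over a link of link_bps (no overhead)."""
    return size_bytes * 8 / link_bps

project = 400e9  # assumed example: 400 GB of footage

print(f"1 GbE : {transfer_seconds(project, 1e9) / 60:.0f} min")   # ~53 min
print(f"10 GbE: {transfer_seconds(project, 10e9) / 60:.1f} min")  # ~5.3 min
```

Real-world throughput will be lower once protocol and disk overheads are included; this only bounds the best case.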
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #162 on: January 24, 2017, 11:53:31 pm »
I am not sure that I am following?

Is it not possible to have a storage computer where access will be quick enough between the two computers?
 

Offline Fsck

  • Super Contributor
  • ***
  • Posts: 1157
  • Country: ca
  • sleep deprived
Re: Proofreading my computer design and building a server rack station.
« Reply #163 on: January 25, 2017, 12:40:21 am »
For your storage solutions, if you want it to be really fast, you need a bunch of 10Gbit Network Cards  and a Switch with similar plugs that supports trunking. 2GB/s should be enough, right? Those devices cost a lot. I think in this case the bottle neck will become the rendering speed, not the transfer speed.  :-DD

A single stream will only achieve 10Gbps.

Assuming you only need a point-to-point connection, that's only $60 USD: $30 per adapter (eBay, of course), which usually includes a short DAC (direct attach copper) cable.
If you really hunt, you might be able to shave it down a few bucks, but that's a trivial amount compared to the rest of the setup.

Ethernet is inherently high latency; low-latency connections use InfiniBand (IB).
You need to define "quick" in terms of throughput and latency.
40G would probably cost you $300-600 USD for the setup, probably not worth it.
« Last Edit: January 25, 2017, 12:43:23 am by Fsck »
"This is a one line proof...if we start sufficiently far to the left."
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #164 on: January 25, 2017, 01:06:36 am »
I'm watching videos and reading some articles right now, but just in case, can someone please give me a quick explanation of setting up this network?

What equipment will it require?
What will the costs be for something fast enough?
Ethernet vs. InfiniBand? (I know Linus uses an Ethernet server.)

Thank you, I will be back in a little while.
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16615
  • Country: us
  • DavidH
Re: Proofreading my computer design and building a server rack station.
« Reply #165 on: January 25, 2017, 03:07:53 am »
For your storage solutions, if you want it to be really fast, you need a bunch of 10Gbit Network Cards  and a Switch with similar plugs that supports trunking. 2GB/s should be enough, right? Those devices cost a lot. I think in this case the bottle neck will become the rendering speed, not the transfer speed.  :-DD

A single stream will only achieve 10Gbps.

Doesn't that depend on the link aggregation mode?  Round-robin should allow a single stream to take advantage of the combined throughput.

I would consider doing this with 1Gbps ports.  10Gbps ports are still too expensive for most budgets.
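On Linux, round-robin aggregation is the `balance-rr` bonding mode; a minimal sketch looks something like this (the interface names and the address are assumptions for illustration):

```shell
# Create a balance-rr bond and enslave two gigabit NICs (iproute2 syntax).
ip link add bond0 type bond mode balance-rr
ip link set eth0 down && ip link set eth0 master bond0
ip link set eth1 down && ip link set eth1 master bond0
ip link set bond0 up
ip addr add 192.168.10.2/24 dev bond0
```

Caveat: balance-rr generally only behaves sensibly on a direct point-to-point link or with static (non-LACP) aggregation on the switch, and out-of-order delivery can hurt a single TCP stream.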
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #166 on: January 25, 2017, 03:22:25 am »
For your storage solutions, if you want it to be really fast, you need a bunch of 10Gbit Network Cards  and a Switch with similar plugs that supports trunking. 2GB/s should be enough, right? Those devices cost a lot. I think in this case the bottle neck will become the rendering speed, not the transfer speed.  :-DD

A single stream will only achieve 10Gbps.

Doesn't that depend on the link aggregation mode?  Round-robin should allow a single stream to take advantage of the combined throughput.

I would consider doing this with 1Gbps ports.  10Gbps ports are still too expensive for most budgets.

LACP does not support round-robin. Anything else results in fun and games - and the switch behaviour is likely to be undefined.

Trunking works great for one-to-many, not so much one-to-one.

10GigE can be had for around $50 for older -CX4 cards, but they're very range limited. 10GBASE-T will run more along the lines of $100-150 for a card, but normal cable can be used at decent ranges. Not unattractive pricing if you need it - NBASE-T should improve matters in the near future, hopefully.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #167 on: January 25, 2017, 03:55:10 am »
I think I was indeed confused about something. A NAS is what I need, correct?

The basic idea is that both my editing PC and my rendering PC have direct access to the same files, without any copying.

For instance...

PC 1 = Editing
PC 2 = Rendering
PC S = Storage

I want to start a movie project on PC 1, but I want all video clips to be stored on PC S, and I want the finished project file to be stored on PC S as well.

With all files on PC S, I want PC 2 to be able to access these files immediately after the project is saved. Once the project is saved and finished on PC 1, PC 2 can begin the rendering.

Neither PC 1 nor PC 2 will store any files; both will share the same storage system, PC S, so that both PCs can access the projects without extremely long copy/paste times.


I think this should be possible with a NAS, right?

I would need both PCs to run through a switch, correct? They would then connect to the storage, which would have a network card. Should Ethernet cables be used?

I want whatever is fastest for file access between the two PCs. I will be constantly reading and writing to this storage device, but I believe the read speeds are most important. How many Gbps should that be?

It must not make the rendering process lag behind, making it slower than if I had just copied the files over and then rendered them. I think this should be possible, right?
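To put a number on "how many Gbps", here's a rough sketch of the sustained rate implied by the footage itself (the ~400GB per hour of 4K is an assumed figure):

```python
def required_gbps(bytes_per_hour: float) -> float:
    """Sustained link rate needed to stream footage at its real-time data rate."""
    return bytes_per_hour * 8 / 3600 / 1e9

# Assumed: ~400 GB per hour of 4K footage.
rate = required_gbps(400e9)
print(f"{rate:.2f} Gbps sustained")  # ~0.89 Gbps
```

So a single 10GbE link would leave roughly 10x headroom over real-time playback; it's scrubbing and multi-stream timelines that push the requirement up.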
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #168 on: January 25, 2017, 06:08:46 am »
Those NAS and SAN terms are confusing. What you need is something where you can expand storage on the go and create iSCSI disks to be shared between computers. So basically it's a combination of NAS and SAN. Let's not get too technical or too deep into the nitty-gritty of the differences between the two. :-D
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #169 on: January 25, 2017, 06:10:50 am »
Those NAS and SAN terms are confusing, what you need is something where you can expand storage on the go and create iSCSI disks to be shared between computers. So basically it's a combination of NAS and SAN. Lets not get too technical and too much in to the nittygritty of the differences between the two. :-D

And what filesystem are you suggesting to use for this NASAN?
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #170 on: January 25, 2017, 06:47:08 am »
I am somewhat familiar with the term iSCSI. NAS or SAN, I need to know exactly what kind of equipment I should start looking into, so thank you slicendice for beginning to clear this up for me.

Of course, worth mentioning is the fact that I only need this to work for these two PCs if anything. I currently don't need an entire business network of 20+ computers, so keeping it simple and keeping data transfer/access quick is the best option.

Please do warn me, however, if this will wind up being a slower process than moving the files with an external hard drive or anything of the like. I don't want a drop in performance if I will be spending so much on the four total PCs; otherwise there would be no point.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #171 on: January 25, 2017, 07:09:04 am »
And what filesystem are you suggesting to use for this NASAN?

Yes, NASANEXPERIMENTALALPHAHYPERULTRADELUXEFS!

Seriously, what kind of question is that? Can you create NASAN FS on your Windows Box? :-DD

I said let's not get too technical. We only need something that can hold hard drives, preferably in a storage pool (though other options are fine too), where we can add and remove physical drives on the go, and create and remove iSCSI disks or shares that other systems can connect to over a network. This drive pool can be on one of the computers or on a separate NAS device; it does not really matter, as long as every computer on the network can access it with decent transfer speeds.


If on Windows, then NTFS would be the best option, if on Linux then EXT3/4 would be great, but as we are not going to use any Linux, we can forget about EXT3/4.


EDIT: Sorry, I misunderstood the question, but my answer is above. NTFS!
« Last Edit: January 25, 2017, 07:10:51 am by slicendice »
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #172 on: January 25, 2017, 07:10:28 am »
And what filesystem are you suggesting to use for this NASAN?

Yes, NASANEXPERIMENTALALPHAHYPERULTRADELUXEFS!

Seriously, what kind of question is that? Can you create NASAN FS on your Windows Box? :-DD

I said let's not get too technical. We only need something that can hold hard drives, preferably in a storage pool but other options are great too, where we can add and remove physical drives on the go, and create and remove iSCSI disks or shares, that other systems can connect to over a network. This drive pool can be on one of the computers or on a separate NAS device, does not really matter, as long as every computer in the network can access them with decent transfer speeds.

If on Windows, then NTFS would be the best option, if on Linux then EXT3/4 would be great, but as we are not going to use any Linux, we can forget about EXT3/4.

If you want to share a single iSCSI target between machines you're going to need a clustering filesystem. Which one do you suggest to use on a desktop Windows system?
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #173 on: January 25, 2017, 07:21:18 am »
If data corruption is a concern, then NFS would be a good option, since we can enable file locks.
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #174 on: January 25, 2017, 07:24:13 am »
If the fear of data corruption is an issue then, NFS would be a good option, so we can enable file locks.

So a typical distributed filesystem. Not a clustering one. No iSCSI needed, then, so.. NAS, not SAN.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #175 on: January 25, 2017, 07:27:50 am »
Maybe you should try to help build this computer instead of trying to pick on me? iSCSI is a very good option as it will look like any HDD on your local machine and is therefore convenient. I use that all the time without any issues.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #176 on: January 25, 2017, 07:29:45 am »
After doing more research, would a NAS/SAN really be a lot slower than a DAS? I need to archive a lot of 4K footage, but I won't need to access it all of the time, so I don't need it always active on the network...

Is there any option other than a network, by chance, that would be faster than a network?

It is all about speed with this workflow.

Examples...

Using a NAS/SAN, would the read/write performance, or access to these files for video editing/rendering, be significantly slower over said network than using a portable storage solution that can be directly connected to each computer?

I don't suppose hooking up HDDs between the two computers via SATA (the usual method) would be faster.

I would prefer the fastest solution for communicating between the two computers. If it is indeed NAS, then I will definitely be going for that. I just want to make sure; I'm not all too knowledgeable about this, so other opinions on which solution is fastest really help.
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #177 on: January 25, 2017, 07:31:24 am »
Maybe you should try to help build this computer instead of trying to pick on me?

I'm not picking on you, I saw an interesting point of confusion and I'm trying to get you to think it through.

Quote
iSCSI is a very good option as it will look like any HDD on your local machine and is therefore convenient.

And that's exactly why it can't work as your first post suggested. iSCSI is a block-level access mechanism. You can't just share one physical device with two active systems with a normal filesystem.

This is essentially the differentiation between NAS and SAN: NAS provides a distributed filesystem which abstracts from the block layer and the true filesystem, SAN provides block-level access and is either used for a single client or with a clustering filesystem to allow multiple clients to co-exist.
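To make the distinction concrete, here's what the two access styles look like from a Linux client (the server address, export path, and target name are made up for illustration):

```shell
# NAS: mount a shared filesystem. The server owns the filesystem,
# so several clients can use it at once.
mount -t nfs 192.168.10.10:/export/projects /mnt/projects

# SAN: log into an iSCSI target. The client gets a raw block device
# (e.g. /dev/sdX) and puts its OWN filesystem on it -- which is why
# two machines must not mount it read-write simultaneously without
# a clustering filesystem.
iscsiadm -m discovery -t sendtargets -p 192.168.10.10
iscsiadm -m node -T iqn.2017-01.local.storage:projects --login
```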
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #178 on: January 25, 2017, 07:34:52 am »
The main protocols used for DAS connections are ATA, SATA, eSATA, NVMe, SCSI, SAS, USB, USB 3.0, IEEE 1394, and Fibre Channel. So this is essentially any drive you have connected to your local computer that is not accessible to other computers.
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #179 on: January 25, 2017, 08:10:56 am »
As for your HB 16x16 on the way, is it possible that the rest of your setup might not be taking full advantage of your HB hardware? Money going to waste is a terrible thing...

So the Nvidia control panel thingy tells you about your SLI status, and it warns you if you're only using a standard SLI bridge, helpfully giving you a marketing link to help you purchase a new paperweight, allowing you to remove that warning.

More seriously, it is very likely to be due to the nature of the test itself as to whether or not it takes any significant advantage of the HB SLI bridge.

What I don't know is what the SLI bridge is actually used for, i.e., is it only for synchronising and coordinating multiple displays, or does it also have other benefits such as when being used as a compute or rendering engine.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #180 on: January 25, 2017, 09:38:18 am »
Alright, so I have two options in front of me then...

Either I make a NAS/SAN, and connect these two computers over a network, or it is back to compromising again.

Since more cores = faster rendering, but higher clock speeds = faster editing, I could essentially build two super computers instead: one that will be extremely efficient for editing both video and CAD/Unreal Engine 4, and a rendering machine efficient enough to make up for the time the files take to transfer from one PC to the other, while still allowing me to browse without bogging the computer down.

Which approach seems more logical to any of you?

Tomorrow I will be more specific on a possible budget (in the 5-digit range), and exactly the work that will be done. Thank you all individually for your help!

Howardlong: Alright, I suppose that could be it. The paperweight joke was funny, by the way.

Let me know if that 16x16 works out for you!
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #181 on: January 25, 2017, 06:57:25 pm »
Here is the x16+x16 HB SLI benchmark. Maybe not a paperweight after all.

 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #182 on: January 25, 2017, 07:13:55 pm »
The OctaneBench won't support GTX10xx cards it appears. If you have another benchmark that tests GPU compute/render I'm happy to give it a go.

CoronaBench, which I don't believe uses the GPU, gave me this result, which seems in the ballpark of their leaderboards (my 22 core with 5% o/c was only a second behind a 2697 v2, which is rather satisfying!)

« Last Edit: January 25, 2017, 07:15:29 pm by Howardlong »
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #183 on: January 25, 2017, 07:45:04 pm »
With plain SLI at x16+x16 it peaked at 73 fps, and the HB bridge got 76 fps at x16+x16. I wonder why it topped the previous score this time at x16+x16, when it actually lost out when you were running x16+x8 SLI vs x16+x8 SLI HB?

I looked into it, and you are right. It doesn't support the GTX 1080 after all. I'll keep looking, but I haven't found anything I feel is reliable for this test yet.

On the Corona Renderer, were you using the 2696 v4 22-core? That is an amazing turnout from just the CPU. Corona Renderer is indeed solely CPU based, and that is quite impressive!

More cores equals faster rendering times, which makes dual and quad socket motherboards extremely helpful when rendering very high definition videos or scenes.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #184 on: January 25, 2017, 08:28:39 pm »
Alright, I put this in a second post because it is a different topic...



My budget range, which I would like to stay beneath, is around $15,000-$20,000 for this entire setup.

As I said, I have some room in what I can spend. I would like to save money where possible, but not at the cost of good quality or of a professional workstation.

I work as part of a small business and I am the one in charge, so to speak. We work on a wide variety of artistic projects; most usually come together in one way or another.

As far as video editing goes, we record in real 4K, and these files consume roughly 300-400GB per hour. Usually I am working with thirty minutes of 4K video at a time, and need to be finished within a week.

Every three weeks we are looking at at least 2TB.
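A quick sketch of what that footage rate implies for archive growth over time (numbers taken from the figures above):

```python
TB_PER_BATCH = 2      # at least 2 TB every three weeks
WEEKS_PER_BATCH = 3

def tb_per_year(tb: float, weeks: float) -> float:
    """Annual archive growth at a steady footage rate."""
    return tb / weeks * 52

yearly = tb_per_year(TB_PER_BATCH, WEEKS_PER_BATCH)
print(f"~{yearly:.0f} TB/year")          # ~35 TB/year
print(f"~{yearly / 2:.0f} x 2TB drives") # ~17 drives/year, before RAID overhead
```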

I'm not concerned about storage right now; the reason for the above description is to spec a computer fast enough to handle editing such high volume content.

Effects, transitions, overlays, and a lot of cuts and edits will be applied.


The CAD/3D modeling side of things comes in for CGI/video game creation. I will be working in Unreal Engine 4. For the most part, the actual games themselves (at the moment) are not extremely massive, but the graphics are rather intense. If you have seen Unreal Engine 4, you know what I am talking about.

I need to both manipulate 3D models quickly in Blender/Maya, and program this game and test it in Unreal Engine 4 as quickly as possible.

We don't work on video games as often (as of now, this could change depending where it all leads), but I would prefer to be future proofed for possibly needed upgrades.


Obviously, all of the above needs to render out somehow, which is why a separate rendering-based computer seems the logical choice to me. Quick rendering is a must.

There is a high volume of content in this department that needs to meet specific deadlines, which is why speed is so important.



On the music production side, I have that mostly covered myself, as it is more my area of expertise, and that is also where saving some money was coming in, as I need to spend something on new recording equipment.

I'll be getting preamps, DACs, and possibly a sound card.

https://www.sweetwater.com/store/detail/MKH416?adpos=1t1&creative=169711078247&device=c&matchtype=b&network=g&gclid=CPGguoidpdECFdGIswodFLMJpQ

https://www.sweetwater.com/store/detail/B2BomberDAC

https://www.sweetwater.com/store/detail/MP2NV

That is an example of what my setup will start looking like. That will run me about $5,000 to achieve, which is why the two(?) PCs for $15,000 is a more realistic price tag, but do not let that distract you from a professional quality setup. Ignore any creep in the budget, and just keep in mind what I said I was willing to pay for the PCs above.

This will also be the equipment running to that second PC, which will also be a rendering machine. Both tasks can make use of multiple cores, so I can render and record simultaneously.

I work in Reaper 5 DAW mostly, which isn't too demanding until you start plugging all of these things in and loading up VSTs and plug-ins. Which I do a lot of.


Besides all of that, I personally spend a lot of time in Adobe Photoshop editing pictures and Adobe Illustrator designing various things. Both programs can be very CPU draining, and they really only utilize one core outside of some effects. I think I could hop between the two computers for these programs. Whatever I build can surely handle it, but my images are sometimes upwards of 10,000px, so they are massive, space consuming, and can slow things down, which again means that I need high speed and multitasking.


So I think I am looking at two individual PCs...

A video/video game editing computer...
A rendering/recording computer... (which will also be used for internet/email/browsing)

I think that pretty much covers everything that I will be working with.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #185 on: January 25, 2017, 08:49:35 pm »
OMG! I knew Adobe Premiere was bad at GPU assisted encoding, but I never thought it was this bad. This is ridiculous for that kind of price tag.
I can't find all the good encoders and editors I used to have anymore. They were total Premiere killers regarding speed and quality for all encoding formats. I bet they still would be... if I could just remember the filenames and the pages I got them from... hmmm...

Here is a comprehensive Premiere benchmark.
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #186 on: January 25, 2017, 08:53:43 pm »
I can't explain the lack of performance benefit on x16+x8, other than I guess there was a bottleneck in there with that test.

I found a benchmark called Blenchmark v1.06 which uses Blender v2.78a. You have a fair bit of control over how and where it is run. Once I'd figured out how to run it, here are the figures for the render:

Code:
CPU 2 sockets, 22 cores.......65.96s
GPU single....................61.55s
GPU dual SLI OFF (same CPU)...33.24s
GPU dual SLI std (same CPU)...33.53s
GPU dual SLI HB (same CPU)....32.93s
GPU dual SLI std (diff CPU)...33.11s
GPU dual SLI HB (diff CPU)....33.55s

So for this test, I am not convinced an SLI bridge has any function for compute rendering tasks; I think it might only be used for real-time physical display rendering.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #187 on: January 25, 2017, 09:34:16 pm »
Slicendice: You were definitely right; Adobe Premiere will not be utilizing much GPU, then. Of course, on occasion I could be downscaling videos from 4K to 1080p for YouTube, and I will be using some of those GPU assisted transitions, but obviously the CPU will be the main focus of the video editing process.

In that case the GPU budget should be oriented toward the CAD/3D modeling projects when choosing the graphics card.

Howardlong: I would agree with you, based on both those benchmarks and continued research. GPU acceleration mostly affects real-time manipulation over anything else, so as far as rendering goes I will focus on CPU = more cores = faster rendering.

GPU focus will be aimed at real-time video editing, and mainly CAD/3D manipulation content.

 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #188 on: January 26, 2017, 04:46:15 am »
A quick check list of the components for the two PC build...

Computer 1 - The video editing/CAD, 3D modeling/Unreal Engine 4 build

CPU - It needs to have a fast clock speed; more than 10 cores will not improve performance greatly.

GPU - It needs to perform well for 3D manipulation. Something like a Quadro, but it doesn't have to be if there is a better alternative.

RAM - I will probably need 128GB of decent DDR4 at 2400-3000MHz.

Motherboard - I suppose this machine can work well enough, or possibly even better, as a single CPU build. What do you think?

Storage - Should I set up a RAID on this computer? Obviously an SSD for the OS. Should the programs be installed on a separate SSD as well, or on the same one? And what about the video assets? They'll be moved to another storage device after editing is finished anyway.



Computer 2 - The video rendering/music production/general purpose/browsing/internet computer

CPU - I am thinking of going for plenty of cores. Either a dual CPU or quad CPU build, I think, would be appropriate for future upgrades. The cores should preferably be fast enough, but more is better here.

GPU - It doesn't have to be as much as the other build, as this one will only be rendering. As slicendice and Howardlong have both shown, GPU accelerated rendering doesn't help out much here.

RAM: Probably the same as the last machine. DDR4 128 GB 2400-3000 MHz.

Motherboard: I'm thinking a dual CPU build; or, after looking at some rendering benchmarks on Corona Renderer over the last two days, a quad CPU system might be a good investment...

One question: can I get away with not immediately populating all 4 CPU sockets? A quad board would be great for future upgrades once I have the spare money, but running the board with maybe two CPUs at first could be a good idea.

Storage: I would probably set up the same thing as the other build, I suppose?

Thank you to everyone again! ;- )
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #189 on: January 27, 2017, 02:33:27 am »
Another question that just came up...

https://www.amazon.com/RT-AC5300-Wireless-Tri-Band-AiProtection-Complete/dp/B0167HG1V6?th=1

Would this router be a good choice? I need consistent and extremely quick internet connection, and I would intend to connect my PCs directly to the router. What do you all think?
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #190 on: January 27, 2017, 03:15:32 am »
Another question that just came up?

https://www.amazon.com/RT-AC5300-Wireless-Tri-Band-AiProtection-Complete/dp/B0167HG1V6?th=1

Would this router be a good choice? I need consistent and extremely quick internet connection, and I would intend to connect my PCs directly to the router. What do you all think?

Good grief, no, buy something serious, not a toy.

Many seem to enjoy these: https://routerboard.com/RB2011UiAS-IN

Personally, if money is apparently no object, this is more like it: http://store.netgate.com/SG-2220.aspx
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16615
  • Country: us
  • DavidH
Re: Proofreading my computer design and building a server rack station.
« Reply #191 on: January 27, 2017, 04:48:56 am »
Would this router be a good choice? I need consistent and extremely quick internet connection, and I would intend to connect my PCs directly to the router. What do you all think?

Build a router out of old PC hardware, or embedded x86 hardware (1) if you must have the smallest form factor and lowest power, and run pfSense, t1n1wall, or SmallWall on it.  If you must have wireless as well, then use separate Ubiquiti access points; I like their NanoStation Loco for indoor use, but I hear their UniFi stuff works well too.

If you want better network security, then use routing between systems to isolate them instead of just an Ethernet switch.  A VLAN switch can be used as an Ethernet port expander to help with this.

(1) Like from PC Engines or Netgate although Netgate apparently no longer sells low end inexpensive hardware.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #192 on: January 27, 2017, 08:26:11 am »
Monkeh, I looked at the page for the last one, specifically going through its specifications, but it doesn't mention a whole lot about its internet access/speed.

Is it fast enough? I mostly use Internet Explorer 11, and sometimes Chrome (extremely slow).
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #193 on: January 27, 2017, 08:49:01 am »
I'm not too familiar with exactly how routers/modems work. I'm reading up on it now, so I can keep up and follow what is being talked about.

Regarding homebrew routers, I was reading this article, which is kind of interesting...

https://arstechnica.com/gadgets/2016/01/numbers-dont-lie-its-time-to-build-your-own-router/
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #194 on: January 27, 2017, 11:26:54 am »
The more expensive routers from ASUS are no toys! They have most of the functionality that every good router has, and even more. The thing with those routers is that they are targeted at professional gamers, so the management UI and such is very visual, while the best routers from the enterprise lines are console only, over a COM port and/or LAN.

EDIT: A quick question. What do you need the router for? A simple switch should be enough to get the computers to communicate. You don't need a router unless you need to separate computer networks. Your ISP-connected router/modem should be enough to get internet access for the devices that need it.
« Last Edit: January 27, 2017, 11:43:57 am by slicendice »
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #195 on: January 27, 2017, 02:24:12 pm »
FWIW, I have always been pretty happy with Draytek 2860n+ as a router solution for home office applications, but I've used Draytek routers before for small satellite office deployments. I have mine set up for auto failover between an ADSL and cable connection, and as a VPN server for remote access. Config for failover was a bit fiddly, but once it's set up it's rock solid. My provider bandwidth is 200Mbps downstream and it seems to cope well enough with that.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #196 on: January 27, 2017, 03:35:18 pm »
Mine is an Inteno GD200AL-AC, with pretty much (if not exactly) the same features as the one Howard uses, except this one has AC WLAN support. The web management UI is a bit fiddly, but for all my networking purposes it does a really good job. I have connected all my computers to an unmanaged switch (TP-Link 5-port TL-SG105), which is in turn connected to the Inteno. All works really well together.

Next week my net gets upgraded from 100 Mbit VDSL2+ to 350 Mbit Ethernet WAN over fiber optics.

All these modem/routers are toys compared to enterprise-grade devices, but who needs that at home? It's not likely we are going to run a 1000-server cluster in our living room or something, are we???  :-DD

These consumer devices are really only as good as their software.
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16615
  • Country: us
  • DavidH
Re: Proofreading my computer design and building a server rack station.
« Reply #197 on: January 27, 2017, 03:52:43 pm »
Quote
These consumer devices are really only as good as their software.

This neatly summarizes why I never use consumer-grade routers anymore; their firmware, and usually their reliability, is almost always poor.  ISP-issued routers are even worse.

Every one of the D-Link routers I bought many years ago is dead; I suspect NAND flash failure.  My even older dial-up era SMC routers still work, though.  Neither ever had good firmware or functionality.

On the other hand, my 10+ year old PC-based FreeBSD router has outlived them all, plus 4+ ISP modem/routers; it has excellent software, functionality, and reliability, and it is much more secure.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #198 on: January 27, 2017, 07:38:29 pm »
Well, at the moment my internet connection at home is absolutely terrible. I'm working with an ISP router/modem thing, and it is terribly unreliable, and the internet connection is as slow as it gets.

The main purpose of a decent router and modem setup would be to ensure stable and extremely quick internet access.

A lot of my work gets transferred through the internet. I also build websites for people, so quick internet access is very important. Besides that, I do a lot of uploads and web browsing, so a fast internet connection (a lot faster than what I'm capable of right now) would be helpful. I'm tired of everything freezing, and having to break out the task manager every five seconds...
 

Offline DimitriP

  • Super Contributor
  • ***
  • Posts: 1307
  • Country: us
  • "Best practices" are best not practiced.© Dimitri
Re: Proofreading my computer design and building a server rack station.
« Reply #199 on: January 27, 2017, 08:56:25 pm »
Quote
I'm tired of  everything freezing, and having to break out the task manager every five seconds...
What is this machine running, for HW and OS, etc. etc. etc.?

   If three 100  Ohm resistors are connected in parallel, and in series with a 200 Ohm resistor, how many resistors do you have? 
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #200 on: January 28, 2017, 02:36:56 am »
The hardware I'm running right now? Or the hardware in the PCs I am going to build?

The hardware right now isn't much at all. 8 GB of RAM, 4 cores, and some cheap AMD graphics card. It's Windows 10 Home.

I do need Wi-Fi as well, though. There are people with laptops who will need to connect wirelessly.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #201 on: January 28, 2017, 09:40:02 am »
Editing computer current parts list...

CPU: Intel 6950X or a 6900K?
GPU: GTX 1080 or a Quadro M5000?
RAM: Corsair Vengeance 2300MHz 64GB X2
Motherboard: ASUS X99-Deluxe
Storage: Intel 750 series or Samsung 950 pro? (Question: should I install both the OS and Adobe/Unreal Engine 4 on one SSD or separate them on different SSDs?)


Rendering computer current parts list...

CPU: Intel Xeon E5 2696 v4 or Intel Xeon E5 2699 v4?
GPU: GTX 1080 or a Quadro M5000?
RAM: Kingston ValueRAM 2133MHz 128GB
Motherboard: ASUS Z10PE D16 WS
Storage: Intel 750 series or Samsung 950 pro? (Question: should I install both the OS and Adobe/Reaper 5/Omnisphere 2/ on one SSD or separate them on different SSDs?)




Thank you everyone for continuing to stick with me through this learning and building project, I really do appreciate it! ;- )
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #202 on: January 28, 2017, 12:33:43 pm »
You can safely install the software on the same drive as the OS, but put the working folders on a separate drive. Unreal Engine 4 with all its assets can be very demanding, and since you want speed, it could be a good option to use an SSD for the projects and just normal HDDs for archiving already-finished work.

An alternative would be to set up the working drive as a RAID array of normal HDDs, as the speed should be enough then and long-term reliability would be much better compared to SSDs (not that SSDs die that fast anymore compared to years back).

I have run UE4 on my laptop, which isn't that fast; it runs well, but the experience is not that smooth in every corner. I had the app installed on my boot drive (SSD 950 EVO) and the assets and projects on a 5200 RPM 2.5" drive. An SSD would have been much better for the projects. I used FHD and smaller content only; if you need 4K, then of course you need faster components. But any desktop today will beat my laptop, no matter how low end it is. :-)

Basically, what I am trying to say here is: have separate app and project drives. And the project drive may, in some cases, need a bit more speed than a single HDD can give (80-160 MB/s transfer rates).
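As a rough back-of-the-envelope check (my numbers, not measurements: the 80-160 MB/s single-HDD range is from above, and the assumption that Ethernet delivers roughly 80% of line rate in practice is mine), here is how long moving a hypothetical 50 GB project takes at various speeds:

```python
# Back-of-the-envelope transfer times for a 50 GB project.
# The 80-160 MB/s single-HDD range comes from the post above; the
# assumption that Ethernet sustains ~80% of line rate is mine.

def transfer_minutes(size_gb, mb_per_s):
    """Minutes to move size_gb gigabytes at mb_per_s megabytes per second."""
    return size_gb * 1000 / mb_per_s / 60

for label, speed in [("single HDD (~120 MB/s)", 120),
                     ("1 Gbit/s Ethernet (~100 MB/s)", 100),
                     ("10 Gbit/s Ethernet (~1000 MB/s)", 1000)]:
    print(f"{label}: {transfer_minutes(50, speed):.1f} min")
```

In other words, a single HDD and gigabit Ethernet are roughly matched, so the network only stops being the bottleneck once you move to 10 Gbit/s or keep the working set on local SSDs.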
« Last Edit: January 28, 2017, 12:36:41 pm by slicendice »
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #203 on: January 28, 2017, 07:23:41 pm »
Alright, I understand what you are saying, thank you!

Thank you for all of your help, slicendice. You've been consistently helping me this entire time since the beginning of the year, and it's been very helpful; I have learned a lot about the build I am trying to make.

About the networking thing from earlier. Would that still be a good idea, or, like I was saying, should I just transfer the files between the computers with external hard drives? Can I have two computers communicating with each other without a third "storage" unit in between, if anything?

I'm still trying to figure out what to do about this internet connection. If the router I was going to get is no good, then what exactly should I be looking into, anyone?

I need a high-speed internet connection that I can plug directly into with LAN/WAN (whichever is faster?), and I do need Wi-Fi as well for laptops. Would two separate routers be a good idea? I'm not very knowledgeable about routers/modems.
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16615
  • Country: us
  • DavidH
Re: Proofreading my computer design and building a server rack station.
« Reply #204 on: January 28, 2017, 08:13:59 pm »
Quote
About the networking thing from earlier. Would that still be a good idea, or like I was saying, just transfer the files between each other with external hard drives? Can I have two computers communicating with each other without a third "storage unit in-between, if anything?

I am not exactly sure what you are trying to accomplish at this point, but I use a dedicated ethernet link between my two workstations and two file servers which is not part of my internet-connected LAN.  So I have two completely separate networks.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #205 on: January 29, 2017, 01:24:24 am »
Well, that is more or less what I'm asking about? I'm not too sure which would be best?

Can I have two workstations connect to each other directly? At first I was trying to have two computers connect to a third (storage unit) so they could both work with the same files. Could I set it up so that the rendering PC could directly access the files on the editing PC?

If that isn't a good idea, again, can I have some assistance in setting up a good and quick third unit to run all of the assets through? Programs would be installed on the editing and rendering PCs, but all videos and assets would be on that "third"/storage computer.

I'm thinking about starting a new topic dedicated to the networking solution in order to get more attention from other users who might be able to add something about networking. My title as of right now doesn't invite very many people to the networking concept I'm looking at.
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16615
  • Country: us
  • DavidH
Re: Proofreading my computer design and building a server rack station.
« Reply #206 on: January 29, 2017, 08:36:56 am »
Quote
Can I have two workstations connect to each other directly? At first I was trying to have two computers connect to a third (storage unit) so they could both communicate to the same files with each other. Could I set it up that the rendering PC could directly communicate with the files on the editing PC?

That is absolutely possible, and essentially what I did and still have, although if you are using Windows it is sometimes difficult to configure with the newer versions.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #207 on: January 29, 2017, 08:47:02 am »
That sounds like good news. As long as I can keep the connection running quickly and seamlessly.

I do happen to be running Windows 10 Pro, and if possible I would like to keep it that way on the next two PCs in this build. Windows 10 might not work for everyone, but I do make use of some of its features that other OSs don't provide.

I did make another topic specifically for discussing a network solution, if you think this is best moved over there, but I intend to keep this topic going, as I'm going to further proofread my components and ensure I have the most suitable setup for everything. Thank you for your comments, David! I would appreciate it if you could further explain how this could essentially be done?
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16615
  • Country: us
  • DavidH
Re: Proofreading my computer design and building a server rack station.
« Reply #208 on: January 29, 2017, 03:58:44 pm »
I am still not sure why you would want this so let me cover what I have and why.

1. I have a big 100 Mbit/s LAN which connects to my internet gateway.
2. I have a separate 1 Gbit/s LAN which connects only my workstations and servers.
3. The internet LAN is only 100 Mbit/s because I have an old 24-port switch, my internet connection is not that fast, and keeping the large switch was cheaper than upgrading everything to 1 Gbit/s.
4. Having a separate 1 Gbit/s connection between my workstations and servers is arguably more secure because the servers do not have to be directly exposed to the LAN.
5. I do have a separate 1 Gbit/s switch now, but it is small.

Today I might do the same configuration with a big 1 Gbit/s switch for my internet connected LAN and a separate small 10 Gbit/s switch for my workstations and servers.  If I only had 3 total workstations and servers, I might even do without the 10 Gbit/s switch and directly connect them together like I did when I only had 1 workstation and 1 server.
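If anyone sketches this dual-network idea out, the one hard rule is that the two networks must use non-overlapping address ranges, so each machine knows which interface to send traffic out of. A quick sanity check with Python's stdlib ipaddress module (the subnets here are hypothetical examples, not David's actual addressing):

```python
import ipaddress

# Hypothetical ranges: one for the internet-facing LAN, one for the
# dedicated workstation<->server link. They must not overlap.
lan = ipaddress.ip_network("192.168.1.0/24")        # internet-connected LAN
storage_link = ipaddress.ip_network("10.0.0.0/24")  # dedicated link

# overlaps() reports whether the two ranges collide anywhere.
print(lan.overlaps(storage_link))  # False: safe to run side by side
```

If the check printed True, one of the two machines would route storage traffic out its internet-facing NIC and the "private" link would silently go unused.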
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #209 on: January 29, 2017, 08:54:03 pm »
According to some of the Linus videos I have just seen, I might have to really start looking at high-end GPUs again. While under normal circumstances Adobe Premiere doesn't utilize the GPU, it would seem that transcoding to Cineform changes that.

Correct me if I'm wrong about something here?

If not, I imagine a Quadro still wouldn't help. Would a TITAN X do significantly better than my GTX 1080 choice from before, at least in this situation? Or am I still better off with the GTX 1080?
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #210 on: January 30, 2017, 08:32:07 am »
Exactly as David says.

But since you need WiFi access too, it's best if you have a configurable switch between the computers and then add wireless access points to that switch (you might need several if your work area is big and you want to keep max speeds anywhere in the house). This way any device can connect to any computer on your internal network, depending on your switch configuration. Since you need to do a lot of data transfers, a small 10 Gbit switch would be the best bet.

Really good configurable switches cost a lot, but it's worth the investment in the long run. With a proper configurable switch you can separate some computers completely from internet if you want.

A computer-to-computer network without a configurable network switch in between can be difficult to get right the first time, and you might need a lot of network cards. It will also be a bit tricky to maintain if you some day need to add more network cards. A switch, once configured properly, is just plug and play.

But as I said before, start with the main editing computer build first; once that machine is up and running and you know its performance with your tools, expand to possible external storage solutions and the rendering computer, so that others can keep editing locally while something is rendering elsewhere. You only want to keep your current project(s) locally; the rendering jobs should be sent to the rendering computer, and the rest should be sent to an external unit for archiving. A SAN unit is a good choice, as you can expand its storage capacity on demand.
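To make the "render elsewhere" decision concrete, here is a toy model (entirely my own assumption, not something from this thread): offloading only pays off when shipping the project across the link plus the remote render still finishes before a local render would.

```python
# Toy model: is sending a render job to the second machine worth it?
# All numbers below are hypothetical examples.

def transfer_minutes(project_gb, link_mb_per_s):
    """Minutes to push project_gb gigabytes over a link_mb_per_s link."""
    return project_gb * 1000 / link_mb_per_s / 60

def offload_pays_off(local_render_min, remote_render_min,
                     project_gb, link_mb_per_s):
    """True if transfer + remote render beats rendering locally."""
    total_remote = transfer_minutes(project_gb, link_mb_per_s) + remote_render_min
    return total_remote < local_render_min

# 50 GB project over ~gigabit (100 MB/s effective): transfer is ~8 min.
print(offload_pays_off(60, 40, 50, 100))  # True: ~48 min remote vs 60 local
print(offload_pays_off(45, 40, 50, 100))  # False: transfer overhead eats the gain
```

The model ignores that the editor keeps working during the remote render, which is the real win being described above, so in practice the threshold is even more forgiving than this.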

Think about your workflow. I would think it as a factory pipeline where an idea goes in the other end and a final product comes out in the other. A finalized product should not stick around on the editing computers and the all RAW data should be on a separate device, from where people make a local copy to their device. We don't want people to edit the original do we?

EDIT: I see now that you started the new thread for the networking solution.
« Last Edit: January 30, 2017, 08:35:43 am by slicendice »
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #211 on: February 01, 2017, 07:30:24 pm »
Can't disagree with any of slicendice's comments directly above.

Increasingly I've been concerned with the approach of building a lot of stuff, only to find out after the event that some of it isn't fit for purpose. At this end of the market I think it's fair to say there is a fair bit of experimentation before getting to the ultimate solution, and much of this depends on your own workflow.

I would rather aim to match a configuration to my own workflow first, rather than find I need to change my workflow significantly to fit a given configuration. 

If you don't already have a workflow, and this is a completely greenfield build, then maybe it needs some paid consultancy from someone with proven experience in your creative fields, especially bearing in mind your budget?
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #212 on: February 02, 2017, 02:31:45 am »
I'm not sure exactly what you mean?

I think I have my workflow basically ironed out at this point. It's a workflow I already have adapted to, more or less. I'm trying to upgrade the business to a more commercial/professional level, but it is something that has been in place for a few years now.

I don't believe I have changed anything necessarily, these are all additional options, or alternatives that others have suggested to me, that I have been taking into consideration. If someone can figure out something better than anything I have in mind, that is really what I have been looking for on that end.

I've explained a few times that the workflow basically consists of continuously editing videos/video games/music with no slowdown. That has been the priority since the beginning; I have only changed the concept of doing this on single computers.

Going off of my original first post, I was proofreading what I had thought was a suitable build, and I was intending to build quite a few of them (4-5) so that my team and I could hop between them, which was really too complicated. I did not explain that clearly before, but I never felt it necessary, because we moved rather quickly to the idea of two separate builds.

With the two separate builds in mind, the workflow basically means what it originally did, which is why I want the network configuration to make it feel as if I were working on one computer instead of two, with the added benefit of no rendering interrupting that workflow by forcing me to stop working.

I can be more specifically detailed if you want me to, regarding the workflow.



By the way, I made a topic concerning the server rack specifically, not necessarily intended for these builds but for a rack of other builds, and I'm having a hard time figuring out which components I would need to get it done. Is anyone interested in sharing their expertise over there?

https://www.eevblog.com/forum/beginners/finding-an-inexpensive-42u-server-rackwith-shelves-in-the-pennsylvania-area/msg1125925/#msg1125925
« Last Edit: February 02, 2017, 06:13:57 am by Lizzie_Jo_Computers_11 »
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #213 on: February 03, 2017, 07:57:05 pm »
After watching some more videos, and running into a few more articles, I've found that the graphics card actually can be used to speed up the workflow.

In Linus' "workflow explained" video, he went over the fact that encoding their raw footage to Cineform with Adobe Media Encoder actually does heavily utilize the GPU. Even though we more or less established (thanks to slicendice) that Adobe Premiere Pro does not utilize the GPU in a regular workload, with the Cineform codec this seems not to be the case; it is rather dependent on the GPU, which can drastically speed up Premiere's timeline playback.

I'm going to keep looking into it, but would this affect my current choice of the GTX 1080? Would another card be more suitable, like the TITAN X, or a Quadro, or is the GTX 1080 still the best option?
 

