Author Topic: Proofreading my computer design and building a server rack station.  (Read 50478 times)


Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #125 on: January 19, 2017, 08:34:05 am »
Quote
The particular MB supports up to 160W E5 v4 CPUs, but it requires a v2.0+ BIOS. If you are buying one, make sure to have the seller flash the latest BIOS (2.0b). On an older BIOS it may not boot, or it may even fry the new CPU.
You can also buy the cheapest E5 v3, even an ES version, from eBay and use it as your "bootstrapping" CPU to flash your new MB. A bonus is that a spare CPU can be handy when diagnosing things.
When I was troubleshooting my system (ASRock X99 + E5 2696 v3, not v4; I upgraded to v4 later), a BIOS defect kept the system from booting, and it was not fixed until the next BIOS release.
I live in Raleigh, NC. What surprised me is that none of the nearby computer shops, including Best Buy, Intrex, and Office Depot, carried any X99 or LGA2011-3 parts for diagnosis, or even for sale. So having your own diagnosis CPU for merely $100 isn't a bad deal, especially if you want to build more 2011-3 systems, for yourself or for a friend. If you don't need it anymore, just sell it; you can get back almost what you paid. You can forget this advice if you live in a big city like NYC, where you can easily get diagnosis service for every platform.

Thank you again for the advice. I will make certain that is done. The last thing I want is to fry my $1000+ CPUs.

$100 seems like a great deal to me, so if for some reason the seller can't update the BIOS for me, I'll try that instead; it makes sense to me.

That was the board I was looking into, and I think it is a pretty good one, and it's a Supermicro board. Does anyone have a suggestion, based on their experience, for a board that would handle the CPUs better in a dual system?
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #126 on: January 19, 2017, 08:53:47 am »
Supermicro is the safest bet; it should run stable out of the box. You will not get some of the fancy features other boards have, but you don't really need those features anyway, unless you are a computer enthusiast.

Someone with hands-on experience with that particular board should help you get things set up properly. It saves you a lot of headaches if there are any "gotchas" regarding the CPU, RAM, and GPU.

The best place to ask for help and further advice regarding a specific motherboard is a forum where people talk specifically about that board. An owners' thread would be great, as there you will find out what issues people are having and how to solve them.

It's hard work these days. Twenty years ago it was simple: the hardware was quite basic, and a soldering iron could fix most issues, but usually whatever board you purchased worked out of the box with pretty much any components attached to it. Today hardware has become quite complex. ;D
 
The following users thanked this post: 3db

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #127 on: January 19, 2017, 09:06:46 am »
Thank you for your advice; that is what I was thinking regarding Supermicro. As for the Supermicro board I posted, I read in an eBay listing that it wasn't compatible with the Xeon E5 2696 for some reason, but that's not true, right? The BIOS update should fix that, I think?

This is the only other board I've found, and it had much better reviews than the D8...

https://www.amazon.com/Z10PE-D16-WS-LGA2011-v3-CrossFireX-Motherboard/dp/B00QC5DZEU/ref=cm_cr_arp_d_product_top?ie=UTF8

Although, what worries me here is that in two of the reviews I read, E5 2699 v4s were not working with that board for some reason. That CPU should be nearly identical to the E5 2696, if I'm not mistaken?
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #128 on: January 19, 2017, 09:19:15 am »
According to ASUS, the E5-2699 v3 and v4 should work on the latest D16 board. It is important to select the right BIOS for the CPU version: a v3 is not the same as a v4. A v4 requires BIOS v3204, while a v3 runs on BIOS v0501. That's a HUGE jump in versions, so it seems there have been a lot of issues on this board too.

As I said before, Supermicro might be the safer bet, as their BIOS is more mature and a lot simpler, without all the fancy stuff ASUS includes.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #129 on: January 19, 2017, 06:50:28 pm »
blueskull: While deciding between the boards I'm looking at right now, I forgot to ask which version of the CPU you're using. Is it a v4, v3, or v2?

I still agree that Supermicro might be a better choice, but I'm looking at the ASUS to see if it might have any features that I could need down the line, or that would be more useful, just in case.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #130 on: January 20, 2017, 02:06:19 am »
Thank you!

I'm starting to get a better picture of what kind of build I will be making at this point. I will post my newly refined build with a link to PCPartPicker later tonight, along with a few more questions, mainly regarding cooling, power supplies, etc.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #131 on: January 20, 2017, 08:43:35 am »
This is my current build...

https://pcpartpicker.com/list/kjywCy

You can replace the dual Xeons with the E5 2696 v4, as I think that is what I am going to get.

I think I will go with that board, but I'm also looking at the ASUS D16 board as well.

I'm not sure about the coolers I picked. They are liquid coolers, but I don't know if that is a good idea, or if I should go for normal fans. I have two in mind for the CPUs, but I'm not sure if I should get more for additional cooling for the graphics card, or anything else in general.

I have some additional questions about the power consumption, but after I finish looking it up myself, I will bring it back here.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #132 on: January 20, 2017, 09:22:54 am »
I'm a bit worried about the dual graphics cards and the PCIe SSD for this build on this particular MB. It seems everything, including the chipset, will be connected to CPU1, and I'm not sure that is a good idea. CPU2 will be doing almost nothing when it comes to managing devices, while all of CPU1's PCIe lanes will be occupied by pretty much everything. I don't think you will get even close to maximum performance from the GPUs in this configuration.

Another thing I noticed is that the RAM is in the way of longer cards in the top PCIe slot. That could be an issue.

Maybe someone else has some additional insights to this.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #133 on: January 20, 2017, 07:12:59 pm »
1. I will look into faster RAM, although from what I have heard, especially in my case, the speed increase isn't much of an improvement.
2. and 3. The chassis does not matter; I am building my own tower/case for this build completely from scratch. That first post is completely outdated, and there is nothing relevant left in there, so it is not something to be concerned over.

About the power supply questions: I'm still looking into it, but this PC is going to consume 1000+ watts of power, and with the rest of my setup, that could trip breakers or cause outages. I was thinking about installing another circuit breaker in my house for it to run on, but I'm having a hard time finding pricing estimates (at least current ones). I had this done before, but that was years and years ago.
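As a rough sanity check of the circuit-load question, here is the usual back-of-the-envelope arithmetic. The numbers (120 V mains, a 15 A breaker, the 80% continuous-load rule) are assumptions typical of US residential wiring, not figures from this thread; check your own panel and local code:

```python
# Rough circuit-load estimate for a ~1000 W PC on a US residential circuit.
# Assumed figures (not from the build itself): 120 V mains, 15 A breaker,
# and the NEC-style 80% derating for continuous loads.

psu_draw_w = 1000        # estimated PC draw at the wall, in watts
mains_v = 120            # US nominal line voltage
breaker_a = 15           # common US branch-circuit breaker rating
continuous_factor = 0.8  # 80% rule for continuous loads

amps = psu_draw_w / mains_v
usable_a = breaker_a * continuous_factor

print(f"PC draw: {amps:.1f} A of {usable_a:.1f} A usable")  # 8.3 A of 12.0 A
print("fits on a dedicated circuit:", amps <= usable_a)     # True
```

So the PC alone fits comfortably on a dedicated 15 A circuit, but with monitors and the rest of a rack sharing the same circuit, the remaining ~3.7 A disappears quickly, which is why a dedicated breaker is a reasonable idea.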
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #134 on: January 20, 2017, 09:56:20 pm »
Quote
I'm a bit worried about the dual graphics cards and the PCIe SSD for this build on this particular MB. It seems everything, including the chipset, will be connected to CPU1, and I'm not sure that is a good idea. CPU2 will be doing almost nothing when it comes to managing devices, while all of CPU1's PCIe lanes will be occupied by pretty much everything. I don't think you will get even close to maximum performance from the GPUs in this configuration.

Another thing I noticed is that the RAM is in the way of longer cards in the top PCIe slot. That could be an issue.

Maybe someone else has some additional insights to this.

The more I think about it (although it might go against some user recommendations), I think I might go with the ASUS board. I have heard a lot of good things about them (Linus uses them), and I think it might be a good overall choice.

Would I still face the same problem with that one? I am somewhat confused.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #135 on: January 21, 2017, 05:33:03 am »
Quote
Both boards use all 80 PCIe lanes from both CPUs; each E5 v3/v4 can only provide 40 lanes.
The Supermicro board has 10 slots of x8 lanes, plus an x4 link derived from the C612 chipset.
The Asus board has 4 slots with x16 lanes and 2 with x8 lanes, all from the CPUs. In addition, it provides an x2 M.2 slot from the C612.

I was reading up on PCIe lanes and GPUs, and I think I understand it somewhat better. I'm still somewhat confused, though, and I want to make sure I get this all right.

Nothing should be incompatible, and for example's sake, say I go with the ASUS board. With that in mind, can somebody explain to me in detail exactly how this works, and the issues I am facing with the graphics cards, lanes, and the CPUs?
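For what it's worth, the lane accounting in the quote above can be tallied mechanically. The slot widths here are taken from the quoted figures, not from vendor datasheets, so treat them as assumptions:

```python
# Tally CPU-attached PCIe lanes per board, using the figures quoted above.
# Chipset-attached links (the C612 x4 / x2 M.2) are left out because they
# don't consume CPU lanes.

lanes_per_cpu = 40   # lanes each E5 v3/v4 provides
cpus = 2

supermicro_cpu_slots = [8] * 10          # ten x8 slots from the CPUs
asus_cpu_slots = [16, 16, 16, 16, 8, 8]  # four x16 + two x8 from the CPUs

for name, slots in (("Supermicro", supermicro_cpu_slots),
                    ("Asus", asus_cpu_slots)):
    used = sum(slots)
    budget = lanes_per_cpu * cpus
    print(f"{name}: {used} of {budget} CPU lanes wired to slots")  # 80 of 80
```

Both boards wire out the full 2 × 40 = 80 CPU lanes, which is exactly why the slots must be split across both sockets: no single CPU can feed them all.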
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5317
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #136 on: January 21, 2017, 06:16:11 am »
Puget Systems did a reasonable range of tests on x8 vs x16: https://www.pugetsystems.com/labs/articles/Titan-X-Performance-PCI-E-3-0-x8-vs-x16-851/

One thing to consider is the logistics. For example, the Asus X9PE-D8 WS board I use has four x16 slots (two per CPU) and supports up to four two-slot GPUs. However, you can't get a GPU with a backplate heat spreader into the first slot, because the DIMM slot tabs hit the board. You could hack the DIMM slots, but it's a design fail. Things like this, regretfully, you only find out at the build stage, unless you can find someone else who has done your build and remembered to document the feature.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #137 on: January 21, 2017, 08:12:26 am »
NO, you don't need all GPUs on one CPU; the Asus board supports 4-way SLI spread across both CPUs' PCIe lanes.
NO, you should not try to force 2-way SLI onto CPU1 if there is a lot of other hardware attached to that CPU while CPU2 has nothing to do but pure computation. This was demonstrated by LinusTech in one of his videos, where he had built a video-rendering beast (can't find that particular one right now). The end result was that when the GPUs were connected to CPU1 it was slow as hell, and after switching them to CPU2 the speed improved significantly.

But don't take my word for it; go find the relevant videos and benchmarks yourself.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #138 on: January 21, 2017, 08:58:42 am »
Thank you both for explaining this further, it is very helpful, I appreciate it!

I'm looking for the Linus video right now, and I will be back after I have read the above article and seen the videos.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #139 on: January 21, 2017, 09:32:09 am »
Try to find the follow-up on this PC build where he realizes his mistake; I'll see if I can find it too.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #140 on: January 21, 2017, 10:38:08 am »
One additional note about having GPUs connected to different CPUs: though it works, there will be a performance penalty because the cards don't share the same address space. An SLI link should mitigate this at least a bit, and it provides additional bandwidth for GPU-to-GPU communication.

I read an article where a company builds servers with 8 GPUs installed. The article explained quite well the relationship between PCIe lanes, CPUs, the PCIe root complex, and switching, and how it all affects performance. Can't find the exact article right now, as I just stumbled on it while looking at other stuff.

 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5317
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #141 on: January 21, 2017, 10:43:10 am »
Try to find the follow-up on this PC build where he realizes his mistake; I'll see if I can find it too.

Indeed, I watched that series a while ago. It is also not clear which board he ended up using. He started off with a Supermicro server board, then switched to an Asus. With both, as I remember, he had enormous difficulties getting them to work with dual graphics cards, despite being on the phone to their techs for a very long time (a service mere mortals won't get). Then I believe Supermicro sent him a workstation board on which he did get the two GPUs working, although it's far from clear. One chassis we see is an enormous 6U+ rack-mount rig with 8 GPUs and dozens of SSDs, but I don't know what that was based on.

Seeing him struggle with those boards was actually very worthwhile, because that is exactly the sort of nonsense everyone else has to put up with when doing a custom build with these lower-volume, less-characterised high-end boards. We just don't have a hotline to the motherboard vendor.

I go hot and cold on Linus. He's young and enthusiastic, but he's also quite naive. The recorded videos are generally pretty slick but often lack depth; I assume he's trying to cater to the infamous ten-minute attention span. A lot of what he does technically has already been figured out by one of his team of minions, although he rarely mentions that, so it often passes as his own work. Sometimes on camera, on the live builds for example, he displays quite a nasty side in the way he talks to his coworkers when they are trying to fix something in the background.
« Last Edit: January 21, 2017, 10:47:19 am by Howardlong »
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #142 on: January 21, 2017, 05:01:47 pm »
Oh no, I see now that I caused confusion by mistake. Yes, there is no known way today to enable more than 4-way SLI. In CUDA/OpenCL we think more in terms of compute modules and cores, so in theory it could scale indefinitely. From the user's perspective, all SLI does is make 2+ cards look as if they were one big GPU.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #143 on: January 21, 2017, 07:49:45 pm »
I watched the video and the follow-up video. I think I have figured out what I should know about PCIe lanes, and I see what everyone is saying.

Now, if each CPU provides 40 lanes, would it not be logical that I could run both cards at x16, even in SLI, on one CPU?
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5317
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #144 on: January 21, 2017, 08:01:45 pm »
Can you point me to some documented benchmarks on running, say, two GPUs across two CPUs as opposed to running both GPUs on a single CPU?

It's not a leading question, I am genuinely interested.

My only direct practical experience of this is with SQL Server dealing with NUMA under VMware (these days almost nothing is on physical hardware), and to be honest, in that respect I rarely have the opportunity to do real-world performance analysis other than on my own setups and on live systems, dealing with the troubleshooting and consequences of mal-configured VMs. Unlike a decade-plus ago, with the increasing use of SANs and VMs, customers rarely replicate their production configs in lower environments, making performance analysis far less deterministic.
« Last Edit: January 21, 2017, 08:08:03 pm by Howardlong »
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #145 on: January 22, 2017, 01:13:13 am »
This is the closest thing I found regarding benchmarks of a 2-CPU/2-GPU setup, but I don't think it's exactly what you were talking about...

http://www.ks.uiuc.edu/Research/namd/mailing_list/namd-l.2011-2012/2577.html

I'm perfectly okay with having to run them in whatever configuration I have to. I might just need the concept of lanes explained to me further. I know the ASUS board has more than one x16 slot, and each CPU supports 40 lanes, so to me (without too much understanding of the concept, I suppose) it sounds like 40 lanes are enough to run both at x16 on only one CPU, correct?
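A quick check of the arithmetic behind that question (the x4 SSD figure is a hypothetical extra device, not a confirmed part of the build; and note the arithmetic alone doesn't settle it, since the board fixes which slots are wired to which socket):

```python
# Can one 40-lane CPU host two x16 GPUs?
lanes_per_cpu = 40
gpus = [16, 16]   # two cards at full x16
pcie_ssd = 4      # hypothetical x4 NVMe SSD hung off the same CPU

print(sum(gpus) <= lanes_per_cpu)             # True: 32 <= 40
print(sum(gpus) + pcie_ssd <= lanes_per_cpu)  # True: 36 <= 40
```

So yes, 40 lanes can carry two x16 cards with room to spare, but only if the motherboard actually routes two x16 slots to the same socket; on many dual-socket boards the x16 slots are deliberately split between the CPUs, which is the situation discussed above.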
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5317
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #146 on: January 22, 2017, 06:38:39 am »
I don't have any dual-GPU machines here, as I have little need for them in my work, but I do have two identical GTX 1070 cards that I could do some benchmarks with to settle this. It would mean taking apart a tight ITX X99 build, though; the 1070 needed some case modding to fit, so taking it out again will take some time.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #147 on: January 22, 2017, 08:50:37 am »
Howard, I would really appreciate that; it would help out a lot. If it's too much trouble, though, you don't have to worry about it. Obviously, computer builds are no simple task, and I don't want anyone putting themselves or their equipment through too much trouble.

I'll be back in a little while; I'm looking into some more information on the CPU/GPU. Again, I want to thank everyone here for taking so much time to help me with this build. It is really appreciated! ;- )
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #148 on: January 22, 2017, 10:03:48 am »
Not everything needs to run through the CPU. Most older i7 CPUs have only 16 lanes in total. This is where the root complex and switching come into play. You can have, for instance, a 32-lane switch chip, and if the only things it controls are two x16 PCIe ports, it will switch a full 16 lanes of data from GPU to GPU without having to bother the CPU. This is just an example. Sometimes, when building more complex systems with very specific requirements, it can be a good idea to check how the CPUs, the chipsets, and all the ports/plugs are wired together.

In your case, you are building a fast general-purpose computer, so winning (best performance) at one end might make you lose at the other. But it should be possible to find a good balance between the two extremes.

That Asus board can't be that bad, or Asus would run out of business pretty soon. If you decide to go for Asus, then make sure you also pick other parts from the Asus QVL as much as possible. If something goes wrong for some unknown reason, you can blame Asus and get your money back, especially if the parts were Asus's own parts from the QVL. Just a hint. :-)
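To put rough numbers on why lane counts (and x8 vs x16 links) matter at all, here is the standard PCIe 3.0 back-of-the-envelope: 8 GT/s per lane with 128b/130b encoding gives roughly 0.985 GB/s of usable bandwidth per lane, per direction. This is generic PCIe arithmetic, not a measurement of any board in this thread:

```python
# Approximate usable PCIe 3.0 bandwidth per link width, one direction.
GT_PER_S = 8.0            # PCIe 3.0 raw rate: gigatransfers/s per lane
ENCODING = 128 / 130      # 128b/130b line-code efficiency
BITS_PER_BYTE = 8

per_lane_gb_s = GT_PER_S * ENCODING / BITS_PER_BYTE  # ~0.985 GB/s per lane

for width in (2, 4, 8, 16):
    print(f"x{width:<2}: {width * per_lane_gb_s:5.2f} GB/s")
# x16 comes to ~15.75 GB/s, x8 to ~7.88 GB/s
```

This is why the Puget-style x8 vs x16 benchmarks matter: dropping a GPU from x16 to x8 halves the link bandwidth, though whether a given workload notices depends on how much it actually moves over the bus.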
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #149 on: January 22, 2017, 01:16:49 pm »
I've been thinking about this computer build. I've been studying a lot of sites that have recommendations for professional audio production, and separately for professional video production.

All of them state that no more than 64GB of RAM will ever be needed for any task in either field. Most even stick with only 16GB, though in some cases 32GB is required for really heavy editing. If you are going to do video editing at a level similar to Pixar's, then you will need Quadro cards for the added GRAM, but I'd say 8GB per GPU is more than enough. Most of the builds described had a price tag well below $2000. Of course, if you want to use SLI, then for each GPU added you can add almost $1000 to the budget.

For audio, more cores are better, but nowhere did I find recommendations for tens of cores. Heck, most users did professional audio production on laptops, which are slow compared to desktops.

I'm starting to think this dual-Xeon build is total overkill and a waste of money. It also introduces a lot of complexity, which in turn can make it difficult to get a stable, reliable build, and which in the worst case will perform far worse than a workstation with just an i7 CPU, 32GB of RAM, an SSD, and a couple of GPUs.

For a workstation, this server mentality is not required. If you want to build a rendering server that runs many rendering/encoding jobs at the same time, then we can talk server, but even then I'd keep the actual workstation separate and somewhat simple. The server could also take on other tasks: hosting a database, working as a NAS, and providing storage for backups and archives. Then we'd need more cores and more RAM; until then, I think it's pointless.

But in the end you pay, so you decide what is best for you.
 

