Author Topic: Proofreading my computer design and building a server rack station.  (Read 50564 times)


Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #150 on: January 22, 2017, 02:48:56 pm »
FWIW here are some benchmarks. Please note that without access to the software and workflow that's going to be used, these are simply indicative. Also please note that this was about 2.5 hours of effort, including gathering the parts, so while I'm happy to try any other specific tests, it is reasonably time-consuming ripping boards in and out and getting the machine to boot again, plus taking care that the right slots are used.

In all cases, I was using x16 slots that were working electrically at x16.

I didn't have an SLI HB bridge to hand so these tests are using a standard SLI bridge.

Machine is Z9PE-D8 WS with 2 x Xeon E5-2696v2 ES QDUF Bclocked to 105MHz, 2.94GHz turbo to 3.15GHz (restricted to 11 cores/socket), 64GB (8x8GB) DDR3 ECC RAM @ 1680MHz, 2 x MSI GTX 1070 Gaming, Corsair HX1000i PSU.

Code:
GPUs                      1xGPU    2xGPU     2xGPU     2xGPU     2xGPU
SLI                       N/A      Off       On        Off       On
PCIe x16 bus              N/A      Same CPU  Same CPU  Diff CPU  Diff CPU

Unigine Heaven  fps       41.7     41.1      73.4      41.2      52.5
Cinebench R15 OpenGL fps  89.04    88.95     90.68     91.28     89.04

Cinebench results strongly suggest it only uses one GPU.

Unigine Heaven has two interesting scores when SLI is on: 73.4 fps when both cards are on the same CPU, but only 52.5 fps when the SLI'd cards are split across two CPUs. Again, please note that this is not necessarily representative of your workload; it is probably more relevant to gamers.
 
The following users thanked this post: MK14, Lizzie_Jo_Computers_11

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #151 on: January 22, 2017, 02:53:42 pm »
Quote
I'm starting to think this dual Xeon build is complete overkill and a waste of money. It also introduces a lot of complexity to the system, which in turn can make it difficult to get a stable and reliable build, and which in the worst case will perform far worse than a workstation with just an i7 CPU, 32GB RAM, an SSD and a couple of GPUs.

I have to agree. If it were just one job then it would be easier to come up with an optimised solution, but there are several somewhat conflicting requirements, which means it'll have to be a compromise one way or another.

My only caveat to that is that the benefit of having a gazillion cores is that you can get on with other stuff without being hindered while a couple of renders are going on.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #152 on: January 22, 2017, 07:57:56 pm »
Howardlong: Thank you for all of the effort you put into that! I really appreciate it, and I know it must have been a lot of time-consuming work on your part. That is more than enough, and definitely worth it, so thank you again.

It seemed 2x GPU / SLI on / same CPU had the overall best results, with the fps significantly higher in Unigine, from about 40 to over 70. There was never any significant case where the fps rose in Cinebench; in fact, when the second GPU was enabled in the first set of results, it actually dropped slightly.

The results didn't show that big of a performance increase in fps, but I believe there would be an even greater improvement in video editing, specifically when scrubbing the timeline and previewing edits. Adobe Premiere can make great use of CUDA cores, so that much is a given.

It should have an even bigger impact when rendering in CAD programs like Blender, which runs primarily on GPU acceleration, and in Unreal Engine 4, which will benefit from more GPU/CUDA cores. That makes your results helpful either way, especially when it comes to this particular task.

slicendice: I actually do agree with your point (and Howard's as well) about the difficulty of balancing the different tasks at hand, which is why I earlier brought up the concept of several different builds in one massive tower, as opposed to a general-purpose machine. I decided a single machine might be easier, but if one machine is going to compromise everything, perhaps we should build several instead. One for each purpose?

slicendice, you're definitely right about the video editing machine: RAM levels out at 64GB, and cores give diminishing returns beyond around 6. Clock speed, it seems, would be most important.

Howardlong, I agree that a lot of what I was thinking with all of the cores was multitasking while renders are going on.

However, with separate builds, it wouldn't make a difference, I suppose.

What do you think? Would separate builds provide an overall better experience for each set of tasks at hand? I think they would.

As a quick layout (although after hearing everyone's opinion on separate builds, I will be far more detailed in this description), I think this is what I could be looking at:

A video editing computer.
A rendering server-like computer.
A general purpose (though less intricate than what I have now) dual Xeon system.

One thing I have to look into is whether CAD programs share enough requirements with either a rendering server or a video editing computer, so that I can figure out which of the two builds can also cover CAD based on their mutual specifications for a professional-grade computer.

Thank you again everyone!
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #153 on: January 23, 2017, 08:23:32 pm »
From research, I have discovered that both video editing workstations and CAD workstations operate using essentially the same kind of specifications.

However, in a video editing computer, the CPU is the most important piece.

In a CAD workstation, the GPU is the most important piece.

I think I might be looking at 2-3 separate computer builds.

I'll start with the video editing build (also, it's likely the CAD build).

It seems that in a video editing environment, a high CPU clock speed is more important than many cores. The cores are useful up to a point, but beyond 8 cores the returns usually drop off. With that in mind, I have been looking at the i7 6900K and the i7 6950X. They clock beyond 3.7 GHz with as many cores as I'll ever need for video editing/scrubbing the timeline.
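
To put rough numbers on that drop-off, here is a quick Amdahl's-law calculation I put together (just an illustration; the 80% parallel fraction is an assumed figure for a render/export-style job, not a measured one):

Code:
# Amdahl's-law sketch: assumes a hypothetical job that is 80% parallel.
def speedup(cores: int, parallel_fraction: float = 0.8) -> float:
    """Ideal speedup when only `parallel_fraction` of the work scales."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (1, 2, 4, 6, 8, 10, 16, 22):
    print(f"{cores:2d} cores -> {speedup(cores):4.2f}x")

# 8 cores give ~3.3x, 22 cores only ~4.2x: each core past ~8 buys very
# little, which is why high clocks matter more for editing work.

So an 8-core part at a higher clock looks like the sweet spot for this machine.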

Graphics cards would probably still be 2x GTX 1080s, or to save a little, I could work with only one of them. As I said earlier, the CUDA cores are utilized in Adobe Premiere, and compared to a Quadro M4000, the 1080 has nearly twice as many.

I've been looking into SSDs for the OS and the program files, and I think the Samsung 950 Pro really is the best choice. The motherboard has an M.2 slot on it.

As for a motherboard, I think the Asus X99-Deluxe would be a good choice, but I'm still looking into other options as well.

As for RAM, no more than 64GB of DDR4 will do me any good. 2400MHz also seems like a safe bet, though I don't think I even need it to be that quick, as faster RAM starts giving diminishing returns.

I think that it's an overall good build for video editing.

In summary, this computer's main objective is...

Video editing in Adobe Premiere CS6 and Adobe After Effects CS6.
Specifically being able to move across the timeline quickly, and preview real-time video edits.

Secondarily...

Be used as a CAD workstation meant for 3D modeling, doing work similar to what you would see in today's high-end video games.
Again, being able to preview in real-time.
Development on Unreal Engine 4, with the same above requirements.

I do believe that the video editing and the CAD workstation will be able to share a build without sacrificing anything on either side, but if you see that there will have to be compromises, I can begin another build for the CAD work instead.



The next computer is intended to specifically be a rendering computer. I would like to set up a network for this all to work on, but I have run into some problems, which I will explain, before going into the actual build.

Adobe Premiere and Adobe After Effects don't appear to support that kind of networking. I'm still looking into it, but I believe they don't have built-in features for working across a server computer.

I'm not sure about the workarounds, but they need to work quicker than doing everything on one machine would.

As for the actual build.

I know that a lot of cores is good for video rendering, so a dual Xeon build would be a good idea here. Those 22-core CPUs, I suppose.

The graphics card is secondary? I don't think it will assist with rendering all that much, but this is where the second GTX 1080 could go instead of doubling up on the editing build.

RAM is next: I suppose between 32GB and 64GB will be more than enough, the same as the above build.

The same storage as above. The Samsung.

I would need a dual Xeon motherboard, and I suppose the earlier mentioned Z10PE-D16 WS board will work.

I'm not too familiar with building servers, or a rendering system, but earlier, the two of you mentioned them, so I think you can help me.

In summary, this computer's main objective is...


Rendering video projects and Unreal Engine 4 projects quickly and efficiently.
The purpose is to keep the editing computer separate from the rendering computer. With this setup, a project can finish the editing stage, be sent to the rendering computer, and undergo the rendering process, while editing continues on another project on the editing computer with no slowdown.

As a last note, I'm assuming it actually isn't a good idea for the server computer and the rendering computer to be one and the same?



As for the last build, that takes me back to the Dual Xeon build we were working on.

This computer will be a general purpose computer for everything else, including music production, email, internet, etc.

The ability to have multiple processes running at once is the basic idea.

The Xeon E5-2696/2699 v4 CPU will work for this purpose, despite its low base clock speed, as its main purpose is multi-core throughput. I could work with just one and upgrade to two of them later on.

The motherboard could be another ASUS Z10PE-D16 WS, which I have already established is a working board for this build.

For RAM, even 32GB of DDR4 should be more than enough.

The same Samsung storage again.

I could add a sound card, dedicated to the DAW programs.

Lastly, I could throw in another GTX 1080, as it will be more than enough for its general-purpose usage.

In summary, the objective of this computer is...

Working as a general purpose machine, for anything outside of video editing/rendering/UE4/server to run through.
Allowing the other builds to work independently without slowing anything down.
Music production that runs smoothly with Reaper 5 (DAW) and VSTs like Omnisphere 2.



In hindsight, I could be looking at up to four builds, but that is where I need everyone else's help.

I will keep researching, but all of your input would be appreciated in figuring out how many builds I need and how to give each one its own specialty accurately.

Thank you, again!
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #154 on: January 24, 2017, 09:43:16 am »
Quadro cards are not faster than GeForce cards; that is a myth. The only two reasons to buy Quadro cards are avoiding rendering-artifact issues and support for more GDDR memory. In fact, GeForce 1080 cards are a lot faster than the Quadro M6000 and P6000, but the Quadros have 24GB of GDDR while the GeForce has only 8GB. And not to forget GeForce in 2-way SLI, which boosts performance even further.

For Unreal 4, if you are creating your own assets, then you will need software like 3D Studio MAX or Maya, and the same GPU specs apply as for the other CAD software. For actual programming, any device that can run the game will be good enough.

For video production, this same build can be used.

For rendering 3D assets and videos, a second computer could be great. That could actually be a server where you virtualize things like AD, SQL Server, source control including automated builds, the rendering engine, etc. BUT BE WARNED! HERE IS WHERE THE PRICE TAG WILL GO UP A LOT! You will pay a lot for licences: server licences for each CPU core, CALs for the servers, and CALs for pretty much every piece of software you add there. You will most likely pay at least as much in licence fees as the actual hardware build costs.


I would really start with the workstation computer first and make that one work well; after a while, if the workflow has bottlenecks because of being limited to a single computer, then I'd go with a server as step 2.

I have an ancient laptop and an even more ancient desktop computer. I use the laptop for programming, and I have even tested how well it works with Unreal and Unity. No 4K games, but for FHD it's fast enough. I have tried all the Autodesk products on it, and though the performance isn't that good, it's still good enough. The laptop has only 8GB RAM, a SATA3 SSD, an i5-2410M CPU and a GF GT-540M card. If you look at those specs and compare them to the devices you are going to put into your build, yours are 10-1000x faster each, depending on which component we are looking at. So please don't try to overkill this build. ;-)

The server part is for AD, SQL, source control and storage. It only has 4GB RAM, 7200RPM HDDs and an ancient 64-bit, 4-core Pentium/Core-something CPU which does not even support virtualization. Everything is set up on the main OS (Server 2012 R2), which is actually a bad idea. :-) All works well and is fast enough for serving only me. But I'd prefer to have support for virtualization, so I could use Hyper-V Server as the host and then virtualize the AD, SQL, source control and NAS into separate VMs. I would need 8 cores and 16GB RAM for snappy performance. That's it! So again, please don't overkill the server build part. :-)

As for a third computer, you don't need it for anything other than backup storage, so keep it light. :-)

About GPU rendering vs CPU rendering: if it's true that Adobe software produces worse-quality video on the GPU than on the CPU, then switch to some freeware rendering engine that does the job hundreds of times faster with lossless quality on the GPU alone. Adobe software used to really suck at this. Trust me, they were terrible. I've tried them, tossed them in the bin and never looked back. I doubt they have improved that much. :-DD
« Last Edit: January 24, 2017, 09:53:17 am by slicendice »
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #155 on: January 24, 2017, 11:13:40 am »
Quote
6. If you insist on buying a single-socket system now, or you are a big Intel fan (which I am), then I recommend you buy a 6800K, because it has the same spec, just with 28 lanes of PCIe instead of 40. For a dual GPU configuration, 16+8 is more than enough. The remaining 4 lanes are for an NVMe SSD.

FWIW, if you intend to overclock, my experience is that the 6800K doesn't do very well at all compared to the 5820K, its previous incarnation (both are LGA2011-v3). All other things being equal, on the examples I have, I can get the 6800K stable at 4.2GHz with a rather high 1.45V Vcore, whereas the 5820K achieves 4.6GHz with little effort, and 4.7GHz with a bit of minor Vcore tweaking. The problem with overclocking the 6800K doesn't appear to be temperature related, as it'll fail at reasonably low temps (~60C on liquid cooling).

If you're not overclocking, ignore this comment; at stock speeds the 6800K will outperform the 5820K.
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #156 on: January 24, 2017, 11:41:17 am »
I now have an HB SLI bridge, so I can re-run a couple of tests, although I'm not expecting much improvement. Frustratingly, it's a 60mm (3-slot) bridge, which means that on the board I'm using, one of the two GPUs will have to sit in an x8 PCIe slot (the slots alternate x8 and x16), so I'll have to get a baseline in the x8/x16 configuration too. Hopefully the baseline will be similar.

Anyway, if there are some specific simple benchmarks I should run, let me know and I'll include them.

FWIW, here is my setup (pic is with a standard SLI bridge). I have used air cooling for this because Xeons tend to be easier to cool than overclocked processors, and, perhaps more importantly, I have more faith in air cooling than I do in liquid cooling in terms of reliability. The processors run at full load at 60C. The disk FWIW is a Samsung 950 Pro NVMe M.2 mounted on an x4 PCIe board on the second CPU.

Edit: the Unigine Heaven tests were run at 3840 x 2160 resolution.

« Last Edit: January 24, 2017, 12:22:17 pm by Howardlong »
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #157 on: January 24, 2017, 07:54:57 pm »
Blueskull/slicendice: I watched this Linus video a little while ago...



The Quadro M6000 was indeed outperformed by the Titan X, and even further by 2x Titan X. He was going to do three, but then tripped the power supply on his build.

I was going to ask what possible benefit the Quadro would have over the Titan X or in my case (unless it would do worse) the 1080.

Running the specifications of the M4000 against the GTX 1080, the 1080 won out in basically all of its on-paper specifications, which lines up with Linus' tests.

With that in mind, would getting a 1080, or if I really needed to, a Titan X, really degrade the performance of CAD operations? I intend to use Blender, and possibly Maya, for all of my 3D modeling, so with only a GTX 1080, will my model manipulation really slow down significantly?

As for the video editing, will getting an M4000 negatively affect the editing experience, slowing down timeline scrubbing or previewing, or would it improve them?

In the Linus video above, it would appear that even the GTX 1080 could outperform the M4000.


slicendice: As far as the NAS is concerned, would building up a SAN instead cost less? Really, what I need is an editing computer and a rendering computer that both have access to the same source files.

If I build up an editing computer and a rendering computer, and run them through a home network system to a third "storage" computer, then they will both have access to the same files/projects/assets. I just need to run it all on a fast network.

I assume I could go with the bare minimum on a storage computer that the editing/rendering computers would have access to, right? A server board would be enough, or a board where I can (as time goes on) continue to add more and more SSDs/HDDs for the editing/rendering computers to access files from. I don't think I really need a GPU, and is there any effect that more than a basic CPU and a little RAM would have? It's only meant for serving files.


Now, in Adobe Premiere, the CPU is most definitely the key. However, Premiere does use GPU acceleration on certain effects and transitions, as well as for rendering those effects. In After Effects, the GPU can be utilized further than in Premiere, since it is a graphics-based program. I assume I will be using After Effects a lot as well.

I think it looks like this...

CPU>GPU: Adobe Premiere Pro CS6
CPU=GPU: Adobe After Effects CS6

blueskull: Are there any other documented cases of the Samsung drives being prone to failure? I was originally going to go for the Intel 750 series because of bad past experience with Samsung products, but the 950 seemed to be the fastest possible, so I changed my mind. I want my SSD to survive for more than a year.

Howardlong: I probably will be overclocking, so thank you for that information. I was indeed hoping to get a (stable) 4+GHz out of my CPU, as that would really boost my performance, especially on the editing machine. I will look into that CPU.

The only additional benchmark that I think I need would be a render time, but if that is too much, you don't have to worry about it.



What I think I'm looking at again, in quick summary... (unless it isn't looking good, let me know)


An editing computer... secondarily a CAD/3D modeling computer.

It needs to...

Move around the timeline quickly/seamlessly.
Real time previewing quickly/seamlessly.
Manipulate 3D models quickly/seamlessly.


A storage computer...

It needs to...

Store all Adobe Premiere Pro/After Effects files/projects for access on the editing computer and then the rendering computer.
Access to these files must be quick, no slower than on a single machine.


A rendering computer...

It needs to...

Render and encode high quality video files (between 1080p and 4K) quickly and seamlessly.
Be able to render/export high quality 3D models for Unreal Engine 4 quickly.
Be able to export Unreal Engine 4 completed projects.


A general purpose/music production/multitasking computer...

This one I pretty much have figured out; it will wind up being the build we were putting together earlier, with the ASUS board and the dual Xeon E5-2696/2699 v4s.

Thank you again, for continuing to help me with these builds.



Does anyone think I should start a fresh topic on this, since the plans have changed from what they were in the beginning?
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #158 on: January 24, 2017, 09:54:14 pm »
If you can let me have the details of an easily reproducible render test I can run here, I'd be happy to run it.

Some interesting scores on Unigine Heaven this evening (all tests on a single CPU's PCIe lanes)...

SLI standard bridge x16+x16: 73.2 fps
SLI standard bridge x16+x8: 71.5 fps
SLI high bandwidth bridge x16+x8: 71.2 fps

So for this test, the $40 HB bridge functions better as a paperweight. I looked at X-rays of these HB bridges last night. Although they have some additional thought put into the PCB layout, their primary function is as a passive device. A nice way to make a few bucks, and the way they're constructed and presented in the packaging almost puts them into Audiophool territory.

I have a 40mm SLI HB bridge coming my way tomorrow to allow an x16+x16 HB SLI test, but I can't help but feel I've been shafted already!
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #159 on: January 24, 2017, 11:11:47 pm »
Here are two rendering tests that I've found...

https://render.otoy.com/octanebench/

https://corona-renderer.com/benchmark/

Those are indeed interesting results. The x16+x16 did the best, even better than the HB bridge showed at x16+x8.

As for your HB x16+x16 bridge on the way, is it possible that the rest of your setup might not be taking full advantage of the HB hardware? Money going to waste is a terrible thing...
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #160 on: January 24, 2017, 11:37:27 pm »
For your storage solutions, if you want it to be really fast, you need a bunch of 10Gbit network cards and a switch with matching ports that supports trunking. 2GB/s should be enough, right? Those devices cost a lot. I think in this case the bottleneck will become the rendering speed, not the transfer speed.  :-DD
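
Some back-of-envelope numbers (the 100GB project size is just an assumed example, and these are raw line rates, ignoring protocol overhead):

Code:
# Rough transfer-time arithmetic; the project size is a made-up example.
GBIT = 1e9 / 8  # bytes per second for 1 Gbit/s of line rate

links = {
    "1 GbE": 1 * GBIT,
    "10 GbE": 10 * GBIT,
    "2x10 GbE trunk": 20 * GBIT,  # best case, and only for multiple streams
}
project_bytes = 100e9  # hypothetical 100 GB of 4K source footage

for name, rate in links.items():
    print(f"{name:14s}: {project_bytes / rate / 60:5.1f} min per 100 GB")

# 1 GbE ~13.3 min, 10 GbE ~1.3 min, 2x10 GbE trunk ~0.7 min at best.

Render times will dwarf any of those numbers, which is my point.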
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #161 on: January 24, 2017, 11:49:56 pm »
Quote
For your storage solutions, if you want it to be really fast, you need a bunch of 10Gbit network cards and a switch with matching ports that supports trunking. 2GB/s should be enough, right? Those devices cost a lot. I think in this case the bottleneck will become the rendering speed, not the transfer speed.  :-DD

A single stream will only achieve 10Gbps.
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #162 on: January 24, 2017, 11:53:31 pm »
I am not sure that I am following.

Is it not possible to have a storage computer where access is quick enough for both computers?
 

Offline Fsck

  • Super Contributor
  • ***
  • Posts: 1157
  • Country: ca
  • sleep deprived
Re: Proofreading my computer design and building a server rack station.
« Reply #163 on: January 25, 2017, 12:40:21 am »
Quote
For your storage solutions, if you want it to be really fast, you need a bunch of 10Gbit network cards and a switch with matching ports that supports trunking. 2GB/s should be enough, right? Those devices cost a lot. I think in this case the bottleneck will become the rendering speed, not the transfer speed.  :-DD

Quote
A single stream will only achieve 10Gbps.

Assuming you only need a point-to-point connection, that's only $60 USD: $30 per adapter (eBay, of course), usually including a short DAC (direct attach copper) cable.
If you really hunt, you might be able to shave it down a few bucks, but that's a trivial amount compared to the rest of the setup.

Ethernet is inherently high latency; low-latency connections use InfiniBand (IB).
You need to define quick in terms of throughput and latency.
40G would probably cost you $300-600 USD for the setup, probably not worth it.
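
If it helps, here's a quick way to put numbers on both (a rough sketch of mine, nothing more; the loopback address is a placeholder for local testing, so point HOST at the storage box over the real link):

Code:
# Minimal latency/throughput probe. HOST/PORT are placeholders: run the
# server half on the storage box and point HOST at it for real numbers.
import socket, threading, time

HOST, PORT = "127.0.0.1", 50007   # placeholder target
TOTAL = 100 * 1024 * 1024         # 100 MB bulk transfer for throughput

def server():
    with socket.create_server((HOST, PORT)) as srv:
        conn, _ = srv.accept()
        with conn:
            for _ in range(100):          # latency phase: echo one byte
                conn.sendall(conn.recv(1))
            n = 0                         # throughput phase: drain the bulk
            while n < TOTAL:
                n += len(conn.recv(65536))
            conn.sendall(b"done")         # single ack at the end

threading.Thread(target=server, daemon=True).start()
time.sleep(0.2)

with socket.create_connection((HOST, PORT)) as s:
    t0 = time.perf_counter()
    for _ in range(100):                  # round-trip a tiny payload
        s.sendall(b"x")
        s.recv(1)
    print(f"avg RTT: {(time.perf_counter() - t0) / 100 * 1e6:.0f} us")

    chunk = b"\0" * 65536
    t0 = time.perf_counter()
    for _ in range(TOTAL // len(chunk)):  # stream the bulk payload
        s.sendall(chunk)
    s.recv(4)                             # wait for the final ack
    print(f"throughput: {TOTAL / (time.perf_counter() - t0) / 1e6:.0f} MB/s")
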
« Last Edit: January 25, 2017, 12:43:23 am by Fsck »
"This is a one line proof...if we start sufficiently far to the left."
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #164 on: January 25, 2017, 01:06:36 am »
I'm watching videos and reading some articles right now, but just in case, can someone please give me a quick explanation of how to set up this network?

What equipment will it require?
What will the costs be for something fast enough?
Ethernet vs InfiniBand? (I know Linus uses an Ethernet server.)

Thank you, I will be back in a little while.
 

Offline David Hess

  • Super Contributor
  • ***
  • Posts: 16615
  • Country: us
  • DavidH
Re: Proofreading my computer design and building a server rack station.
« Reply #165 on: January 25, 2017, 03:07:53 am »
Quote
For your storage solutions, if you want it to be really fast, you need a bunch of 10Gbit network cards and a switch with matching ports that supports trunking. 2GB/s should be enough, right? Those devices cost a lot. I think in this case the bottleneck will become the rendering speed, not the transfer speed.  :-DD

Quote
A single stream will only achieve 10Gbps.

Doesn't that depend on the link aggregation mode?  Round-robin should allow a single stream to take advantage of the combined throughput.

I would consider doing this with 1Gbps ports.  10Gbps ports are still too expensive for most budgets.
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #166 on: January 25, 2017, 03:22:25 am »
Quote
For your storage solutions, if you want it to be really fast, you need a bunch of 10Gbit network cards and a switch with matching ports that supports trunking. 2GB/s should be enough, right? Those devices cost a lot. I think in this case the bottleneck will become the rendering speed, not the transfer speed.  :-DD

Quote
A single stream will only achieve 10Gbps.

Quote
Doesn't that depend on the link aggregation mode? Round-robin should allow a single stream to take advantage of the combined throughput.

I would consider doing this with 1Gbps ports. 10Gbps ports are still too expensive for most budgets.

LACP does not support round-robin. Anything else results in fun and games, and the switch behaviour is likely to be undefined.

Trunking works great for one-to-many, not so much for one-to-one.

10GigE can be had for around $50 for older CX4 cards, but they're very range limited. 10GBASE-T will run more along the lines of $100-150 for a card, but normal cable can be used at decent ranges. Not unattractive pricing if you need it. NBASE-T should improve matters in the near future, hopefully.
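
The reason a single stream stays on one link: the switch (or bonding driver) picks a link per flow, typically by hashing the flow's addresses and ports. A toy illustration (not any vendor's actual hash, just the general idea):

Code:
# Toy per-flow hashing demo: one flow always lands on the same link, so a
# single SMB/NFS session cannot exceed one link's speed. Illustrative only;
# real switches use their own (vendor-specific) hash functions.
import hashlib

def pick_link(src: str, dst: str, dst_port: int, n_links: int) -> int:
    key = f"{src}-{dst}-{dst_port}".encode()
    return int.from_bytes(hashlib.md5(key).digest()[:4], "big") % n_links

# One storage session = one flow = one link, every time:
print(pick_link("10.0.0.1", "10.0.0.2", 445, 2))

# Many parallel flows do spread across the two links:
for port in range(49152, 49160):
    print(port, "->", pick_link("10.0.0.1", "10.0.0.2", port, 2))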
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #167 on: January 25, 2017, 03:55:10 am »
I think I was indeed confused about something. A NAS is what I need, correct?

The basic idea is that both my editing PC and my rendering PC have direct access to the same files, without any copying.

For instance...

PC 1 = Editing
PC 2 = Rendering
PC S = Storage

I want to start a movie project on PC 1, but I want all video clips to be stored on PC S, and the finished project file to be stored on PC S as well.

With all files on PC S, I want PC 2 to be able to access them immediately after the project is saved. Once the project is saved and finished on PC 1, PC 2 can begin the rendering.

Neither PC 1 nor PC 2 will store any files; they will share the same storage system, PC S, so that both PCs can access the projects without extremely long copy/paste times.


I think this should be possible with a NAS, right?

I would need both PCs to run through a switch, correct? They would then connect to the storage, which would have a network card. Ethernet cables should be used?

I want whatever is fastest for file access between the two PCs. I will be constantly reading and writing to this storage device, but I believe the read speeds are most important, however many Gbps that turns out to require.

It must not make the rendering process lag behind, becoming slower than if I had just copied the files over and then rendered them. I think this should be possible, right?
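
On the PC 2 side, I imagine the hand-off could be as simple as a watch-folder on PC S that the rendering machine polls. Something like this rough sketch (the share paths and the render command are made-up placeholders, since Premiere CS6 doesn't have a built-in network-render feature as far as I can tell):

Code:
# Rough watch-folder sketch for the rendering PC. The UNC paths and the
# render command are hypothetical placeholders; substitute whatever
# renderer/encoder actually gets used.
import subprocess, time
from pathlib import Path

WATCH = Path(r"\\PC-S\projects\ready")    # hypothetical share on PC S
DONE = Path(r"\\PC-S\projects\rendered")  # hypothetical output folder

seen = set()
while True:
    for proj in WATCH.glob("*.prproj"):   # finished projects from PC 1
        if proj.name in seen:
            continue
        seen.add(proj.name)
        print(f"rendering {proj.name} ...")
        # Placeholder command: swap in the real render/encode tool here.
        subprocess.run(["render_tool.exe", str(proj), "--out", str(DONE)])
    time.sleep(30)                        # poll the share every 30 seconds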
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #168 on: January 25, 2017, 06:08:46 am »
Those NAS and SAN terms are confusing. What you need is something where you can expand storage on the go and create iSCSI disks to be shared between computers, so basically a combination of NAS and SAN. Let's not get too technical or dig too much into the nitty-gritty of the differences between the two. :-D
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #169 on: January 25, 2017, 06:10:50 am »
Those NAS and SAN terms are confusing, what you need is something where you can expand storage on the go and create iSCSI disks to be shared between computers. So basically it's a combination of NAS and SAN. Lets not get too technical and too much in to the nittygritty of the differences between the two. :-D

And what filesystem do you suggest using for this NASAN?
 

Offline Lizzie_Jo_Computers_11Topic starter

  • Regular Contributor
  • *
  • Posts: 89
  • Country: us
Re: Proofreading my computer design and building a server rack station.
« Reply #170 on: January 25, 2017, 06:47:08 am »
I am somewhat familiar with the term iSCSI. Whether NAS or SAN, I need to know exactly what kind of equipment I should start looking into, so thank you, slicendice, for beginning to clear this up for me.

Of course, it's worth mentioning that I only need this to work for these two PCs. I currently don't need an entire business network of 20+ computers, so keeping it simple and keeping data transfer/access quick is the best option.

Please do warn me, however, if this will wind up slower than moving the files with an external hard drive or the like. I don't want a drop in performance when I'm spending so much on the four PCs; otherwise there would be no point.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #171 on: January 25, 2017, 07:09:04 am »
Quote
And what filesystem do you suggest using for this NASAN?

Yes, NASANEXPERIMENTALALPHAHYPERULTRADELUXEFS!

Seriously, what kind of question is that? Can you create a NASAN FS on your Windows box? :-DD

I said let's not get too technical. We only need something that can hold hard drives, preferably in a storage pool (though other options are fine too), where we can add and remove physical drives on the go, and create and remove iSCSI disks or shares that other systems can connect to over a network. This drive pool can be on one of the computers or on a separate NAS device; it does not really matter, as long as every computer on the network can access it with decent transfer speeds.


If on Windows, then NTFS would be the best option; if on Linux, then EXT3/4 would be great, but as we are not going to use any Linux, we can forget about EXT3/4.


EDIT: Sorry, I misunderstood the question, but my answer is above. NTFS!
« Last Edit: January 25, 2017, 07:10:51 am by slicendice »
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #172 on: January 25, 2017, 07:10:28 am »
Quote
And what filesystem do you suggest using for this NASAN?

Quote
Yes, NASANEXPERIMENTALALPHAHYPERULTRADELUXEFS!

Seriously, what kind of question is that? Can you create a NASAN FS on your Windows box? :-DD

I said let's not get too technical. We only need something that can hold hard drives, preferably in a storage pool (though other options are fine too), where we can add and remove physical drives on the go, and create and remove iSCSI disks or shares that other systems can connect to over a network. This drive pool can be on one of the computers or on a separate NAS device; it does not really matter, as long as every computer on the network can access it with decent transfer speeds.

If on Windows, then NTFS would be the best option; if on Linux, then EXT3/4 would be great, but as we are not going to use any Linux, we can forget about EXT3/4.

If you want to share a single iSCSI target between machines, you're going to need a clustering filesystem. Which one do you suggest using on a desktop Windows system?
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Proofreading my computer design and building a server rack station.
« Reply #173 on: January 25, 2017, 07:21:18 am »
If fear of data corruption is an issue, then NFS would be a good option, so we can enable file locks.
 

Offline Monkeh

  • Super Contributor
  • ***
  • Posts: 7992
  • Country: gb
Re: Proofreading my computer design and building a server rack station.
« Reply #174 on: January 25, 2017, 07:24:13 am »
Quote
If fear of data corruption is an issue, then NFS would be a good option, so we can enable file locks.

So a typical distributed filesystem, not a clustering one. No iSCSI needed, then, so... NAS, not SAN.
 

