| Goodbye Windows, Hello Linux [advice needed for a Linux workstation at home] |
| particleman:
I have used Debian since 2006. I have used GNOME Shell for many years now and it does what I need. I tried Arch and KDE for a few years on a laptop; it was really nice, but life came up and I got too busy to constantly tinker, so I went back to Debian. Linux is awesome. Not having to worry about anti-virus is pretty cool, along with no system crashes, awesome uptime, and installs that don't really creep up in size over the years. My current install is from 2011 and is using under 19 GB. Still rocking my AMD 8150. I won't upgrade until something dies beyond repair. |
| NiHaoMike:
--- Quote from: Nominal Animal on January 19, 2019, 03:00:43 am ---If you use a Linux machine, and do not configure it to your needs, then you are essentially using a screwdriver to hammer in nails. If you want to get shit done, you configure the machine to work for you, or you pay someone else to do it for you, or it will suck. That is the cost of free/open source software; deal with it. --- End quote --- Windows also needs to be configured to work the way you want it to work. |
| Nominal Animal:
--- Quote from: NiHaoMike on January 19, 2019, 04:10:05 am ---Windows also needs to be configured to work the way you want it to work. --- End quote --- Can you, in general? I do admit that the cost in time, effort, and knowledge is high, but with Linux, I can. It is never about whether something is possible for me; it is always just a question of balance: whether something is worth the effort. Because I prefer programming over administration, I tend to over-amortise the development costs (because of the modular approach, and the assumption that if useful and done well, it will be useful for others or later on as well). I have submitted patches all over, from the C library to the kernel to userspace applications, and can write a new kernel driver, service daemon, or a UI application if needed; that is the high-cost part. Paying someone else to do it can cost a lot of money. When I maintained Windows and Mac machines (granted, almost two decades ago), customization was very limited. Most useful/efficient solutions I implemented ended up having at least one part that worked around implicit assumptions in the OSes. Even visual customization was limited. Nowadays I make my own bootsplashes for fun (my favourite being Tux, my avatar, rolling its eyes during bootup). It's the equivalent of other people putting stickers or decals on their machines, I guess. |
| radar_macgyver:
--- Quote from: RoGeorge on January 18, 2019, 03:03:53 pm --- --- End quote --- 24 fps seems rather low. If vsync is on, confirm that the big Acer B326K is not forcing everything to run at 24 fps. Some combinations of video card, display and cable may end up operating at a much reduced refresh rate. If so, I'm not surprised you're getting tearing. The canonical program I use for testing if acceleration works is glxgears. With vsync turned off and the default window size, I get ~18000 fps on my Fedora 28 box with an NVIDIA GTX950. With vsync on, it's fixed at 60 fps. I haven't noticed any tearing when watching video, although my display is 'only' 1440p. |
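A minimal sketch of that check, assuming an X11 session with glxinfo and xrandr installed (mesa-utils and x11-xserver-utils on Debian-family systems); it reports whether direct rendering is active and the refresh rate each connected output is actually running at, so a display that has quietly negotiated 24 Hz shows up immediately:

--- Code: ---#!/usr/bin/env python3
# Sketch only: report whether direct rendering (GPU acceleration) is
# active and which refresh rate each connected output is running at.
# Assumes an X11 session with glxinfo and xrandr on the PATH.
import re
import subprocess

def run(cmd):
    # Return the command's stdout, or "" if the tool is not installed.
    try:
        return subprocess.run(cmd, capture_output=True, text=True).stdout
    except FileNotFoundError:
        return ""

# glxinfo prints "direct rendering: Yes" when a hardware driver is in
# use (as opposed to the llvmpipe software rasteriser).
for line in run(["glxinfo"]).splitlines():
    if "direct rendering" in line:
        print(line.strip())

# In xrandr's output, the refresh rate of the active mode is marked '*'.
output_name = None
for line in run(["xrandr"]).splitlines():
    m = re.match(r"^(\S+) connected", line)
    if m:
        output_name = m.group(1)
    elif "*" in line and output_name:
        rate = [tok.rstrip("*+") for tok in line.split() if "*" in tok][0]
        print(f"{output_name}: active refresh rate {rate} Hz")
--- End code ---

For the glxgears test itself, Mesa drivers honour the vblank_mode=0 environment variable to disable vsync, while the NVIDIA proprietary driver uses __GL_SYNC_TO_VBLANK=0.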
| RoGeorge:
24 FPS is the movie frame rate (23.976 fps, strictly). My monitors are all 60 Hz, and vsync also runs at 60 Hz, so this needs a frame-rate conversion from 23.976 Hz to 60 Hz. What I noticed is that when only one monitor is active and the video is full screen, the BLIT changes to FLIP. BLIT means a bitmap image is copied to a certain location in the video screen RAM (asynchronously, I guess), while FLIP means a full image is built in a buffer, and the buffer is flipped to the display during a vertical sync, synchronous with the monitor's frame rate. BLIT or FLIP, a frame-rate conversion is still required from 24 to 60 fps. When properly implemented, the frame-rate conversion is done on the fly, with frame interpolation. Without frame interpolation, the image will stutter here and there, because 60 Hz is not a multiple of 23.976 Hz. I don't know whose job the frame-rate conversion is: the GPU driver, the movie player/decoder, or maybe the window compositor. |
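To put numbers on that stutter: 60 Hz divided by 23.976 fps is about 2.5, so without interpolation each film frame has to stay on screen for either two or three refreshes, and the pattern cannot be regular. A small stand-alone sketch of that cadence (not tied to any particular player or driver):

--- Code: ---#!/usr/bin/env python3
# Sketch of the cadence a naive "show the newest decoded frame" policy
# produces when ~23.976 fps video is displayed on a 60 Hz monitor.
from collections import Counter

SOURCE_FPS = 24000 / 1001   # NTSC film rate, ~23.976 fps
DISPLAY_HZ = 60.0

shown = []
for refresh in range(24):               # first 24 display refreshes
    t = refresh / DISPLAY_HZ            # wall-clock time of this refresh
    shown.append(int(t * SOURCE_FPS))   # source frame visible at that time

print("source frame shown at each refresh:", shown)

# Each source frame ends up on screen for 2 or 3 refreshes in an uneven
# pattern (the last entry is cut short by the 24-refresh window); that
# uneven cadence is the judder seen without interpolation.
print("refreshes per source frame:", sorted(Counter(shown).items()))
--- End code ---

The usual fixes are motion interpolation in the player or GPU, or simply switching the display to a 23.976/24 Hz mode while the movie plays, so that every frame gets the same number of refreshes.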