I’ve been working exclusively from a laptop for the past decade. It was the only option, since I was traveling so much. But now that I’m mostly home, I see how little sense working from that MacBook makes. The screen is too small; I have to scroll forever, and slouch forward unless I want to squint my eyes. The butterfly keyboard is bad; too many typing errors that consume time. And the computer is too slow; it hangs and stutters when too many heavy browser tabs are open at once.
In short: I could use a new computer, a machine that wouldn’t be slower than me, thus not hindering my productivity. So I had this idea:
Why don’t I figure out how to build the best computer for working from home? The reasoning: every small increase in productivity directly results in higher lifetime passive income. Even a 1% bump in productivity is worth a fortune over the long run. It justifies almost any cost.
It would be foolish not to do so then.
So I set out on a quest to do it, and learned more than I ever wanted to know about keyboard switches, the human eye lens, or VRM phases.
Well, here’s the result.
Best Productivity Machine
Yes, my computer uses are simple: I write and edit my articles, books, and sales copy. I research with a million open browser tabs. I work on my sites and businesses. But even if your uses are heavier (programming, video editing, graphic design), this guide will still point you in the right direction.
To build that ultimate workstation, you’ll have to buy each component and then assemble everything yourself. It’s not as hard as it sounds. And it’s worth it. You’ll have a machine that maximizes your output per hour.
For most productivity use cases, the monitor is one of the most important components (if not the most important). Just take a look at Pfeiffer’s research:
Increasing screen real estate will increase overall productivity, even in every-day tasks. As the paper says, “We instinctively feel more at ease with more screen space, just as we prefer to have a larger work table rather than a small one that forces us to move things around constantly.” When working on a computer, we lose much more time than we realize through our interactions with it. A display that eliminates the need to shuffle windows, open and close palettes, zoom in and out, etc, will increase our productivity.
The research tested tasks like editing text, formatting spreadsheets, or retouching images; others focused on interapplication integration, measuring the impact of a large display on work involving two individual programs. It presents key data on ROI (return on investment) based on the cumulative effect of small, incremental productivity gains over time. Exactly what I was looking for!
The major findings:
- Computer displays are a widely overlooked productivity factor of the personal computer, and they can contribute significantly to productivity, efficiency, and overall throughput. High-resolution displays can result in measurable productivity gains.
- Productivity gains were present not only in digital imaging, video or design applications, but also in office applications such as word processors and spreadsheets, as well as in the general personal productivity of the computing environment.
- A larger display area often results in new productivity strategies that make best use of the display in ways that one cannot easily imagine when working on a smaller display.
- Seemingly small productivity gains (linked to a large, high-resolution display) on frequently repeated operations can result in a significant return on investment (ROI) over time.
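To make that last ROI point concrete, here’s a back-of-the-envelope sketch in Python. Every number in it (monitor cost, minutes saved, hourly rate) is hypothetical, chosen only to illustrate the mechanism, not taken from the paper:

```python
def payback_days(monitor_cost, minutes_saved_per_day, hourly_rate):
    """Working days until the time savings pay for the monitor."""
    value_per_day = (minutes_saved_per_day / 60) * hourly_rate
    return monitor_cost / value_per_day

# A $500 monitor that saves 15 min/day for someone whose time is worth $40/h:
print(round(payback_days(500, 15, 40)))  # 50 working days to break even
```

After the break-even point, every saved minute is pure gain, which is why small repeated savings can justify a big one-time cost.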
The impact a display has on your work is often underestimated. Think of it as a tool. With a big, high-resolution monitor, you can keep numerous windows or applications open at the same time without having to jump between them. Many applications make use of every inch of even the largest resolution. Being able to view multiple detailed documents without zooming in and out or scrolling makes for a very comprehensive experience, and immediate productivity gains. It takes less time to combine and position elements from two or more different images, documents, etc. The result is a smoother workflow and less time spent managing windows. Love it!
It’s simple. With a larger-resolution display, you can simply see more information. If you write, you’ll be more efficient because you can see more of the text. If you compare two documents, you’ll be faster because you can see both the original text and the new one side by side without having to shuffle windows around. Being able to have three full web pages next to each other makes researching a breeze. In the research, it took longer on a smaller screen to combine information from a spreadsheet with a word-processing document. It took three times longer to position image elements in Photoshop because of the constant zooming and panning. On a large-resolution display, you can easily work on a large spreadsheet while keeping a web page and your email client open and visible at the same time. This results in significant productivity improvements.
I believe the monitor is what affects my productivity the most. A small resolution or screen slows me down, because there’s just not enough real estate to keep everything I’m working on open, so you have to start moving things around, scrolling, etc. This just takes too much precious time.
With a smaller screen at a higher resolution, you’d have to scale the UI, and operating systems still haven’t nailed that process down. On Linux, for instance, anything outside 200% scale looks blurry.
An AOC/ViewSonic 31.5″ 2K (2560×1440). PPI is 93, so text will look plenty big and sweet, but not too grainy. It has 70% more screen real estate than Full HD. Another option is a 4K 27″ Philips, but I’d have to scale everything to 200%, so text would look very sharp (4K scaled 2x, to Full HD), but I’d be working with a 1920×1080 real estate. Not large enough. A third option is a 42.5″ 4K monitor, giving me ~103 PPI. Great text size, great resolution, and great screen size.
I used to work on a 5K 27″ iMac back in the day. But the text would look too small in native 5K (5120 x 2880) on a 27″, which is why Mac OS scales the resolution down to 2560×1440. What’s the point? I am getting sharper text, yes, but I don’t get to enjoy the productivity of a big resolution.
So a large screen (or multiple screens) is a must. I considered a couple options:
Here’s the overall setup dimensions and specs for each:
#1: 42.5″ TV – 3840×2160 (8.3m pixels at 103PPI) – 94x53cm
#2: Dual 24″ – 1200×1920 (4.6m pixels at 94.5PPI) – 72x55cm
#3: Triple 21.5″ – 1080×1920 (6.2m pixels at 102.5PPI) – 88x49cm
#4: 31.5″ – 2560×1440 (3.7m pixels at 93.25PPI) – 70x39cm
#5: Dual 23.5″ – 1080×1920 (4.15m pixels at 93.75PPI) – 60x52cm
#6: Dual 21.5″ – 1080×1920 (4.15m pixels at 102.5PPI) – 59x49cm
#7: 34″ or 35″ – 3440×1440 (4.95m px at 106.5-109.5PPI) – 80-85×33-39cm
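If you want to check these numbers yourself, or run them for a screen not on the list, PPI and total pixel count are simple to compute. A quick Python sketch:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def megapixels(width_px, height_px):
    return width_px * height_px / 1e6

# Option #1 above: a 42.5" 4K screen
print(round(ppi(3840, 2160, 42.5), 1))   # ~103.7 PPI
print(round(megapixels(3840, 2160), 1))  # 8.3 megapixels
```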
As you see, a large 4K screen made the most sense (pixels per dollar). It would essentially be four FHD (1920×1080) screens combined! All you have to do is arrange your windows on that one big screen so it acts like multiple separate “monitors”, only without the ugly bezels (which, though, act as “mental borders” between tasks for some people).
I chose to go with a 4K 70″ TV.
TVs these days are close siblings of PC monitors. The difference? TVs come with a bunch of image-processing features and Smart functionality, which you can turn off. But the panels are similar. And if you can find yourself a TV that supports 4:4:4 chroma subsampling (most of them do), you’ve got a TV that can display text properly and function just like a PC monitor.
- Low input lag.
- 4:4:4 chroma. This is important for PC use, because it allows for 1:1 pixel mapping. Movies from Blu-ray and 4K Blu-ray are encoded in 4:2:0, upscaled to 4:2:2, then sent to the TV, which maps it however it’s programmed to. 4:2:0 means the color resolution (chroma) of the screen is 1/4 the brightness (luma) resolution. So for a 1920×1080 picture, the color resolution is only 960×540. Very small! This saves disc space while providing nearly equivalent fidelity to 4:4:4. The drawback of 4:2:0 encoding (which almost all videos use – Blu-ray, YouTube, etc.) is lower clarity on high-frequency textures. 4K is a major upgrade here: 3840×2160 luma gives you 1920×1080 chroma. A PC outputs 4:4:4 natively – it is the source, so it doesn’t use chroma subsampling; it doesn’t need the compression. Now imagine colored text missing 3/4 of its color resolution: it looks blurry, because characters are small and their strokes are often only 1-2 pixels wide.
- IPS panel. IPS gives you good viewing angles, but the problem is light leakage, resulting in a low contrast ratio (brightest white vs. darkest black). If you set the monitor to 100 nits at a 1000:1 ratio, that means 100/1000-nit blacks, or 0.1-nit blacks. The best IPS panels today produce a ~1500:1 contrast ratio; the MacBook’s next-gen IPS can do 1800:1, which is VERY high for IPS. However, that’s abysmal compared to other technologies. VA-panel PC monitors produce a ~3000:1 contrast ratio. VA-panel TVs produce between 4000:1 and 7000:1. The drawback of VA is its very narrow viewing angle, so you have to sit in a sweet spot for everything to look good. OLED has the best viewing angles (wider than IPS) and also the darkest blacks. It is an emissive display, which means it has an infinite:1 contrast ratio. This gives the image depth, and professional movie-studio content is generally color graded on OLEDs. Hisense’s LMCL will change everything. It has two LCD layers, so you get a ~125000:1 contrast ratio, as good as OLED, but it can get much brighter than OLED. LMCL isn’t out yet except for professionals – $45K for a 32″ screen – and those blow away everything on the market. If you’re not in a hurry, wait for LMCL. If you’re in a hurry, and you intend to do a lot of movie watching on the PC, get a VA-type LCD or an OLED.
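The chroma arithmetic from the 4:4:4 point above is easy to sketch. A minimal Python helper showing how much color resolution each scheme keeps (the subsampling factors are the standard ones):

```python
def chroma_resolution(luma_w, luma_h, scheme="4:2:0"):
    """Chroma (color) plane resolution for the common subsampling schemes."""
    factors = {
        "4:4:4": (1, 1),  # full color resolution (what a PC outputs)
        "4:2:2": (2, 1),  # horizontal chroma halved
        "4:2:0": (2, 2),  # chroma halved both ways -> 1/4 the color pixels
    }
    fw, fh = factors[scheme]
    return luma_w // fw, luma_h // fh

print(chroma_resolution(1920, 1080, "4:2:0"))  # (960, 540)
print(chroma_resolution(3840, 2160, "4:2:0"))  # (1920, 1080)
```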
If you go for a VA-panel TV, check Rtings’ reviews; you can trust their recommendations for your budget.
If you intend to do color work, you’ll need to stick with a VA-panel TV and buy a colorimeter: an i1Display Pro, an i1Studio, or a ColorMunki Display.
A word on pixels per inch (PPI). Regardless of resolution and screen size, the lower the PPI, the larger the text and the farther away you’d have to sit for the image not to look pixelated. The higher the PPI, the smaller and sharper the text. For most folks, a PPI between 90 and 110 is best at normal sitting range. For example, both 22″ at Full HD (1920×1080) and 43″ at 4K (3840×2160) are about the same PPI (102-103), and are pretty optimal for most people. The text would be the same size on both. But if you had a 32″ 4K monitor, the PPI would be much higher, and it would be very difficult to read the text without scaling (magnifying) it ~150% or so. And once you scale to 150%, you’re no better off than using a cheaper 2560×1440 monitor: you lose the benefit of a large resolution’s real estate. Check the PPI of each resolution / screen size here.
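The “scaling eats your real estate” point is just division. A small sketch of the logical resolution you’re left with at a given scale factor:

```python
def effective_resolution(width_px, height_px, scale_percent):
    """Logical desktop area left after UI scaling (the real estate you work in)."""
    f = scale_percent / 100
    return int(width_px / f), int(height_px / f)

print(effective_resolution(3840, 2160, 150))  # (2560, 1440)
print(effective_resolution(3840, 2160, 200))  # (1920, 1080)
print(effective_resolution(5120, 2880, 200))  # (2560, 1440) -- the 5K iMac case
```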
27″, for example, is a terrible size for 4K displays. At native resolution everything is too small, and at 2x scaling it’s cartoonishly large. So you’d want a 4K monitor large enough to let you enjoy its native, large resolution without having to scale the text.
BUT – we actually ARE going to deliberately use a low-PPI monitor, even if that means gigantic text up close, because it lets us sit far away from the monitor, where the text is readable and not pixelated. And our eyes will be at infinity focus ;)
So consider getting a huge TV (60″ and above) as a computer monitor (the PPI is going to be very low), and simply move the desk to the middle or back of the room. This is best for your eyes: as soon as they focus on objects farther than 1-1.5 meters away, the lens is in its infinity-focus configuration, which means the ciliary muscles (which contract the lens for near focus) are completely at rest. It’s when the eyes strain to keep things in focus during up-close work that we get fatigued and tired.
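If you want to estimate how far back you’d need to sit for a given PPI to stop looking pixelated, a standard rule of thumb is that 20/20 vision resolves about 1 arcminute. A rough sketch (the 63 PPI figure is what a hypothetical 70″ 4K TV works out to):

```python
import math

def min_distance_m(ppi, acuity_arcmin=1.0):
    """Distance at which one pixel subtends less than normal visual acuity
    (~1 arcminute for 20/20 vision), so the image stops looking pixelated."""
    pixel_pitch_m = 0.0254 / ppi                  # pixel size in meters
    angle_rad = math.radians(acuity_arcmin / 60)  # 1 arcminute in radians
    return pixel_pitch_m / math.tan(angle_rad)

# A 70" 4K TV is roughly 63 PPI:
print(round(min_distance_m(63), 2))  # ~1.39 m -- right at infinity-focus range
```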
As far as I can tell, tp4tissue from geekhack.org was the first to discuss this combination of factors – eye vergence and focal distance – at least online. Big screens were not practical until recently, but now we have affordable big monitors (read: TVs).
There’s no ongoing research into this because of office configurations. The bulk of long, daily computing is done at work, and most offices couldn’t use their space efficiently if everyone needed 1.5-2 meters of clear distance to a screen.
But better configurations can be achieved.
But it would be pretty expensive.
For example, three curved Samsung 75″ quantum TVs at 4-5 meters would best satisfy tp4tissue’s criteria, but these will probably never be made again, as curved high-end big panels have been discontinued. The drawback with Samsung is their TV OS/UI: since their TVs aren’t designed for PC use, they have dynamic backlights, which rules out serious Photoshop work or video color grading. Samsung does not respect color, unlike companies such as Panasonic or Sony. I’m not sure whether either of those went very deep into curved LCDs.
Curved is a nice-to-have because it reduces contrast drift on VA-type LCDs, and it also reduces stochastic eye movement and focus changes. If you don’t need good image quality, there are IPS-type LG TVs with nano particles, but those use special dopants with irregular spectral density, which makes color calibration problematic without a spectrometer – a ~$5K tool, whereas colorimeters are only ~$250.
The top option today, absent curved big panels, is the 77″ OLEDs. They’re still quite affordable for what they are, but because their safe brightness is only ~130 nits, you can’t use them in a bright office unless you’re OK with fast burn-in from static computer screen elements. OLEDs are also difficult to color calibrate for computer use, as their ABL (automatic brightness limiter) kicks in very low in the brightness range. Samsung still makes their lower-end curved TVs, but those don’t have very stable backlights or wide-gamut quantum colors.
Stepping down, there are the flat quantum LCDs. You don’t want Samsung for these; you want something with the Roku platform, because its UI is superior for computer compatibility, and those TVs apply less of the post-processing that makes color calibration difficult.
Hopefully, at the end of this year we’ll get the LMCL panels, which in my opinion obliterate OLEDs for monitor use: 500 nits full field, 0.004 nits black level. A complete game changer. You could almost use these outdoors.
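The contrast ratios discussed above translate directly into black levels. A quick sketch, using a 100-nit white point and the panel figures mentioned earlier as illustrative inputs:

```python
def black_level(white_nits, contrast_ratio):
    """Darkest black a panel can show at a given white level."""
    return white_nits / contrast_ratio

print(black_level(100, 1000))    # 0.1 nits  (typical IPS)
print(black_level(100, 5000))    # 0.02 nits (VA-panel TV)
print(black_level(100, 125000))  # 0.0008 nits (claimed dual-layer LMCL)
```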
If you like the thought of dual monitors, where you can have two apps maximized instead of constantly shuffling windows around on one giant screen, maybe you should go for two. Or just use a tiling window manager (more on that later). Some folks see their productivity fall when splitting attention between two screens. Others feel more productive on two monitors: one for the main app, and one for supporting apps – documentation, browser, etc. But you can achieve the same effect with a massive single monitor, especially if you’re using a tiling window manager instead of managing windows by hand.
Some developers like using a monitor vertically (portrait mode). It better matches your neck’s vertical range, which is more ergonomic. It fits many more lines on the screen, and reads a bit like a long page of code. Most code doesn’t have long lines, so horizontal space is wasted space. And most importantly – less distraction. You’re purely focused on what you see. Scrolling vertically also feels more natural. If that sounds good, go for it. If you use software with multiple panes open, etc., then horizontal is better. Here’s what three monitors in a vertical setup look like:
Alternatively, use a tiling window manager or some other software to split your desktop screen space into multiple parts. On a big monitor, that lets you achieve the same thing without the physical bezels separating the windows – although some people find the bezels mentally useful, helping to separate tasks in their mind.
Overall, if you’re working mostly with text and documents, then portrait (vertical) mode is valuable. Monitor aspect ratios are built to fit the format of movies and games. But if all you’re doing is text, then vertical pixels are much more valuable. Only think about width after you’ve got enough vertical pixels.
I wanted something that was built specifically to maximize words per minute output, a keyboard with minimal travel time and good tactile feedback. And without that ugly RGB lights bullshit. For that reason I turned to the Scrivener community (professional writers primarily) and asked for their opinions.
I got an enormous number of responses, with a myriad of recommendations. I made a list of them all so I don’t forget: Das Keyboard (Cherry switches), Filco Majestouch, Anne Pro 2 or Keychron K2 / K4 (Cherry or Gateron switches), Leopold, Matias (Matias Click switches), HHKB and Realforce (Topre switches), Azio, Keyboardio Atreus, even the old IBM Model M, or the odd OLKB Planck keyboard:
The HHKB Classic, for example, was engineered so that your palms never leave the home row (the middle row), so your hands move less. Combine that with Dvorak instead of QWERTY and you’ll have a true typing machine. If you can increase your WPM (words per minute) from 70 to 200, that’s an enormous productivity increase.
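How much is a WPM jump actually worth? A toy calculation, assuming a hypothetical 2,000 words of drafting per day:

```python
def minutes_to_type(words, wpm):
    """Minutes needed to type a given word count at a given typing speed."""
    return words / wpm

# Drafting a hypothetical 2,000 words per day:
slow = minutes_to_type(2000, 70)   # ~28.6 minutes
fast = minutes_to_type(2000, 200)  # 10 minutes
print(round(slow - fast, 1))       # ~18.6 minutes saved, every single day
```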
Here are things to consider:
- Keyboards are a unique, personal component. Like a guitar, they are a device you are likely to keep with you for as long as it lasts. Once you get used to a keyboard, you don’t want to replace it. Therefore, invest in one with a solid build quality that’ll serve you for years.
- Switches! You want to aim for clicky, mechanical switches. Their feedback gives you a better “feel” for the keyboard and helps you avoid time-consuming typing errors. Cherry MX, Matias Click (here’s how they compare) and Topre switches all have their raving fans. Someone told me I should also try Outemu Blues. Try them out (there are switch tester kits on Amazon) and see which one you like best!
I chose a tenkeyless (a full sized layout, just without the Numpad) because I never use the Numpad.
If you’re going to sit very far from the computer (like with a massive monitor), you need a wireless keyboard. Opt for one with good battery life, for better quality of life.
As I said, my uses aren’t heavy. You’ll often find me with 10-20 open browser tabs, a bunch of spreadsheets, and maybe a couple of big text files. This is mostly just memory-heavy. Even an i5 CPU could handle all that. But I want something a little future-proof. I might dive a bit into Python/Java, or do some image editing in GIMP/Krita. Or who knows what.
It takes a CPU a certain amount of time to render anything – a game frame, a web page, etc. A stronger CPU does things in less time. An 8-core 3700X / i9-9900 can render a page in 1 second, whereas a 4-core 3400G or an i3 might take 2 seconds. To you, you blinked and it was over. It’s hard to say how much of a difference there is at such a small scale. But at large scale, it’s a different story. Work on a project while compiling code at the same time and the 4 cores get swamped, while an 8-core wouldn’t even blink.
AMD CPUs’ cache is generally larger than Intel’s, which gives them the edge in productivity uses (pretty much any software that isn’t a game).
Intel could support faster memory, but they choose not to, because that lets them sell pricier chipsets that do support faster memory.
I think a 6-core 12-thread CPU, like a Core i5 or a Ryzen 3600, would make a lot of sense in my case. However, I still went with an i9-9900. Reason? I got it for 50% off.
Also, the i9 (like most Intel CPUs) comes with an integrated graphic card, sparing me the need to buy a separate one that would not only cost money, but would also increase heat output and noise levels. I always prefer an onboard GPU for my uses.
The i9-9900 is a very fast 8-core, 16-thread CPU that will blaze through all my workloads. Even though it’s a bit overkill, it will see me through many years of use and will only get better as I throw more work at it. At a 50% discount, I can’t go wrong, so I might as well enjoy the power I’ll have on tap.
But really, in a productivity machine for someone like me (not an image/video editor), it’s not so much what the CPU can do, but what the supporting equipment will allow the CPU to do. Low RAM, slow storage, limited peripheral support, lack of options: all of that adds up to one thing, extended nap times, which we want to avoid.
As for you, if you don’t need a strong CPU (like me) and don’t have access to a heavy discount like I did, perhaps you’d better just skip it. For example, do you compile code every day? Then a stronger CPU like the 3700X will show time gains over something like the 3600; but if it’s just once a week, the extra few minutes saved don’t mean much. It’s nothing more than work done while you’re in the bathroom or getting something to drink. On everyday, simple tasks (like opening web pages, editing text, etc.), I seriously doubt there’d be any noticeable difference whatsoever, everything else being equal, as you wouldn’t be saturating the equipment. Instructions per clock are so close between these CPUs that humans can’t see the difference. You’re talking about nanosecond differences.
An old CPU might handle 1,000 instructions per clock cycle at 4.2GHz – 4.2 billion cycles per second. A newer CPU might handle 2,000 instructions per clock cycle at 5.0GHz – 5 billion cycles. You’re into fractions of a nanosecond per instruction. You can’t perceive a difference of less than a second without a side-by-side comparison, and even then maybe not. The differences between CPUs go much further, but you can’t really compare AMD to Intel on a 1-to-1 basis, as their architectures differ in how they handle things. There’s also core count, bandwidth per core, memory speed, memory controller speed, cache, and a host of other factors. A CPU like the 3700X has more cores, more cache, a better memory controller, etc., so it’s an all-around stronger CPU, even if its architecture is similar to the 3400G’s. There’s also transistor count: there are a lot more transistors on a 3700X than on a 3400G, so it’s faster all around.
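Using the illustrative numbers above (deliberately simplified; real per-core IPC figures are far lower), the time-per-instruction arithmetic looks like this:

```python
def seconds_per_instruction(ipc, clock_hz):
    """Average time per instruction: 1 / (instructions-per-cycle * cycles-per-second)."""
    return 1 / (ipc * clock_hz)

old = seconds_per_instruction(1000, 4.2e9)  # the older chip above
new = seconds_per_instruction(2000, 5.0e9)  # the newer one
print(f"{old:.2e} vs {new:.2e} seconds")    # ~2.38e-13 vs 1.00e-13
```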
I picked the Noctua NH-D15 as my cooler of choice. Reason? The i9-9900 is a very hot CPU at full load, first of all. But there are reasons beyond that. I wanted my system to be as silent as possible, and it’s hard to beat Noctua at that. I picked their largest cooler. Seriously, it is a monster of a cooler – I’ve never seen an air cooler this size. But since I don’t plan to use a case (I’ll just bolt it onto the upper shelf of my vitrine), it will fit anyway. And the heatsink is so large that the fans barely have to work. I can’t even hear them spinning. My system is dead silent, and very cool.
Everything in my day takes massive amounts of RAM. Gazillion tabs? RAM. Massive text files? RAM. Lots of images, PHP files, heavy browser usage? RAM. 16GB wasn’t going to cut it. 16GB is the recommended size for average gaming, where one game and maybe a few apps run simultaneously, soaking up 6-13GB under heavy use. For productivity uses, I’d be looking at 32-64GB.
I got myself a 32GB kit (2×16GB) from Silicon Power. I got the ones without heatsinks (RAM doesn’t need cooling, plus those ‘gamer’ heatsinks are childish). I got DDR4-2666, but faster RAM has gotten cheap. The price difference between speeds is often minimal, so get the fastest one; the sweet spot these days seems to be 3200 to 3600. Note that you enable those faster speeds through the motherboard’s XMP profiles. But don’t obsess over it. I doubt those RAM speeds will make any discernible difference.
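For what it’s worth, the theoretical peak bandwidth gap between those speeds is easy to compute, assuming dual-channel DDR4 with its standard 8-byte (64-bit) bus per channel:

```python
def bandwidth_gb_s(mt_per_s, channels=2, bus_bytes=8):
    """Theoretical peak bandwidth: transfers/sec * bus width * channel count."""
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

print(round(bandwidth_gb_s(2666), 1))  # 42.7 GB/s (what I bought)
print(round(bandwidth_gb_s(3200), 1))  # 51.2 GB/s
print(round(bandwidth_gb_s(3600), 1))  # 57.6 GB/s
```

Real-world gains are much smaller than the peak numbers suggest, which is why I’m not obsessing over it.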
You want a motherboard with strong, stable power delivery. This is your productivity machine! Stability is king. Any downtime and you’re losing time that could be spent producing.
Look for a motherboard with a good VRM. Some boards use VRM doublers, which split power between two lanes of MOSFETs, chokes, and capacitors; the PWM controller sees each doubled pair as a single phase. This lets a controller that supports up to 6 phases drive a 12-phase design. While still better than a plain 6-phase design, doubled phases aren’t as efficient as true 8- or 10-phase VRMs: the doubler delays the PWM signals and halves their frequency, and only one of the two lanes can switch on at a time (the second is typically delayed by half a cycle). True multi-phase VRMs switch one after the other with no notable delay in between, while doublers add latency that reduces overall efficiency. So a 5-phase VRM doubled to 10 is less efficient than a native 7- or 8-phase VRM. Still, a doubled 10-phase VRM beats a plain 5-phase one; it’s a cheap trick to allow a higher power draw.
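The basic load math behind phase count is simple. A sketch with a hypothetical 150 A CPU load (real draw depends on the CPU and its power limits), assuming an ideal even split across phases:

```python
def amps_per_phase(total_amps, phases):
    """Current each VRM phase must carry, assuming an ideal even split."""
    return total_amps / phases

# Hypothetical ~150 A CPU load:
print(amps_per_phase(150, 6))   # 25.0 A per phase
print(amps_per_phase(150, 12))  # 12.5 A per phase in a doubled design
```

Fewer amps per phase means cooler MOSFETs, which is why a doubled design still helps thermally despite the switching delay.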
Don’t be fooled by flashy VRM heatsinks. These are mostly decorative and are clamped on top of the VRM’s plastic/epoxy encapsulation. Most of the real VRM heatsinking is provided by the motherboard’s power and ground planes.
A weak VRM will heat up and thermally throttle, reducing the CPU’s speed.
I chose to go with a Gigabyte AORUS Z390 Mini-ITX. Reasons? The Mini-ITX Z390 boards are the only ones that come with an HDMI 2.0 port, which allows us to connect the computer to the TV and run 4K@60Hz instead of 4K@30Hz, which is much easier on the eyes. This board has a solid VRM, and reliability is very important for a production machine.
Storage Drive (SSD)
SSDs these days are very fast and very reliable. I went with two drives:
Honestly, if you don’t work with large files, you won’t notice the speed difference at all in your daily work. Only if you edit gigantic image/video files is it worth shelling out the extra money for a pricey SSD.
Both drives are great so far. You can’t go wrong with either.
I installed the Arch Linux distribution on one drive, and Debian (Stable) on the other. I use Debian as the main environment, till I master the learning curve of Arch.
Power Supply (PSU)
As I said, power delivery and stability are very important for a production machine. I want my components as safe as possible, especially when the weather is bad and the electricity becomes unstable.
I chose to go with the Silverstone SX700-LPT. Reasons: It’s a small-form-factor PSU, meaning it won’t take up too much space, and if I ever want to move my computer into a small case, it will fit. Second, it’s modular, so I don’t have all those cables dangling in the air – only what I need is connected. Third, it’s rated 80 PLUS Platinum, which means great efficiency. Fourth, it’s SILENT. Not only because SFX-L comes with a 120mm fan (quieter than the standard SFX’s 92mm), but also because this particular power supply is very strong for what my computer needs, even at full load. The result? It doesn’t even spin its fan. Yes, the power supply is 100% silent all the time – the fan isn’t even needed. I wanted the build to be as quiet as possible.
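Efficiency ratings translate to wall draw like this. A sketch using ~92% (roughly what 80 PLUS Platinum requires around half load) and a hypothetical 300 W system load:

```python
def wall_draw_w(dc_load_w, efficiency):
    """AC power pulled from the wall to supply a given DC load."""
    return dc_load_w / efficiency

# Hypothetical 300 W system at ~92% efficiency:
print(round(wall_draw_w(300, 0.92)))  # 326 W from the wall, ~26 W lost as heat
```

Less waste heat is also why the fan can stay off: the PSU barely warms up at light loads.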
Doesn’t get any quieter than this.
I used to like Mac OS, coming from Windows. It’s a good-looking, stable operating system that works out of the box. It felt much less intrusive or buggy than Windows. However, recently it started to annoy me. I hate the enforced bundling of Apple software on my Mac. I don’t need most of it, and I don’t want to see it when I’m navigating my computer. It takes away energy from my brain. My computer is not a toy – I don’t need Apple Music, or Siri, or whatever. It’s my working tool.
I wanted something secure, clean and free of the distractions that Windows or Mac OS comes with. I dislike Windows – too intrusive, too buggy, too much messing with security, updates and “Are you sure?” screens. Yes, I am sure.
I liked Mac much better. It’s beautiful, intuitive, and secure (compared to Windows). But for the love of God, Apple, give your users some freedom. It is full of pre-installed junk that I don’t need, many features that I’d like to delete but can’t. I don’t need Mail, Reminders, Messages, PhotoBooth, Music, TV, Voice Memos, Stocks, Books, Home, Siri, Chess, Stickies, Image Capture, Automator, Grapher, etc. Why can’t I delete all that? It just distracts me.
And I absolutely hate it.
I want to control the operating system that I use, not to be controlled by it. It is demoralizing to live in a house that you cannot rearrange to suit your needs. I feel like I could do better work if I liked the environment that I spent most of my day in. Therefore, GNU/Linux, where you are free to customize (and minimalize!) everything you want.
Linux is the only option for me, especially as someone who only needs a browser and writing software. Unlike on Mac or Windows, there’s not a lot to get lost in, not a lot to distract me from my work. I didn’t realize how much time I spent navigating on Mac / Windows until I started using Linux. The abundance of extra steps needed to navigate or run an application was truly astounding. On Linux, there are just files, applications, and the terminal.
I’m currently toying with Arch + i3-gaps (a tiling window manager), trying to get used to it and make it work. Meanwhile, I have stable Debian running on the second drive, in case I want to boot into something easier to work with.
For writing, I downloaded Ghostwriter. To replace Mac’s Preview app, I installed Xournal, which has the simple annotation tools I need to add dates, text, and my signature. It works exactly as needed. To manage my images, I use the simple Shotwell application.
Now that I made the switch to an open-source, free (as in freedom!) OS like GNU/Linux, I can’t see myself ever going back to the bloated, proprietary ones I used before – Mac OS or Windows.
Ok, enough for now.
Time to be productive.