Blog

Goals for 2019

December 29, 2018

It's that time of the year again, when we look back on what we've accomplished this year and set goals for the new one.

Overall, I'd say 2018 was another good, low-stress year for me. I didn't run into any major problems, and things went quite well, especially on the business side of things.

It did feel like 2018 just flew by. It feels like I wrote the ...

Flameshot: An awesome screenshot capture tool for Linux

December 24, 2018

When it comes to screen capture and other media editing tools, Linux falls well behind Windows and Mac in terms of good options.

For a long time I've been using the default "Screenshot" tool that comes with Ubuntu. While it works fine, it requires quite a few additional steps if you need to annotate the image, as you have to save it first and then use another tool to ...

Celebrating the Small Wins

November 5, 2018

Our tiny, two-person bootstrapped software company, Highview Apps, has recently passed $100k in total revenue. This is certainly a big milestone for us, and it took almost 2 years to get to this point. For some reason, it didn't feel that long; it felt like we actually got here fairly quickly. Looking back, I realize that part of the reason is that we celebrated the small wins, which kept us going and motivated.

When you ...

How to resume download over SSH using "rsync"

October 30, 2018

This is another one of those "note to self" posts.

I needed to download a big file (close to 6GB) from a remote server last night, but the Internet connection at my Airbnb was a bit spotty and would drop periodically.

Normally, I'd just use scp, and even 6GB of data is no big deal on a fast and stable connection. But with a slow or unstable connection, I needed a way to resume the download ...
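
The excerpt cuts off here, but for quick reference, this is roughly the kind of rsync invocation that handles a resumable download over SSH (the host and file paths below are placeholders, not taken from the post):

    rsync --partial --progress -e ssh user@remote-host:/path/to/bigfile.tar.gz ./bigfile.tar.gz

With --partial, rsync keeps the partially transferred file if the connection drops, so re-running the same command picks up from what's already on disk instead of starting from scratch.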

How to use "nohup" with chained commands in Ubuntu

October 30, 2018

This is another note to self, as I keep forgetting the syntax.

Basically, I needed to use nohup (short for "no hangup") to execute a PostgreSQL database dump while connected to a remote server via SSH. The reason is to make sure the command doesn't get terminated in case I get disconnected from the server, as the command could take some time to complete due to the size of the database I'm backing ...
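
The rest is cut off, but a minimal sketch of the pattern the title describes, i.e. wrapping a chained (piped) command so nohup applies to the whole thing, looks roughly like this (the database name, user, and file names are placeholders):

    nohup sh -c 'pg_dump -U postgres mydb | gzip > mydb.sql.gz' > dump.log 2>&1 &

Wrapping the pipeline in sh -c keeps the entire chain running under nohup, and the trailing & plus the redirected output let you disconnect from the SSH session without killing the dump.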