NextCloud – Improve Images preview and reduce CPU load

Here’s a quick guide on how to improve image previews and reduce NextCloud’s CPU usage while browsing directories containing images.

First you’ll need to install the following NextCloud App: Preview Generator

Then you’ll have to run the following “occ” commands in order to optimize the size of the generated thumbnails:

php ./occ config:app:set previewgenerator squareSizes --value="32 256"
php ./occ config:app:set previewgenerator widthSizes  --value="256 384"
php ./occ config:app:set previewgenerator heightSizes --value="256"
php ./occ config:system:set preview_max_x --value 2048
php ./occ config:system:set preview_max_y --value 2048
php ./occ config:system:set jpeg_quality --value 60
php ./occ config:app:set preview jpeg_quality --value="60"

Once the Preview Generator app has been installed and activated, execute the following “occ” command:

php ./occ preview:generate-all

This will scan all of your files and generate thumbnails. To keep previews up to date afterwards, the app also provides the lighter “preview:pre-generate” command, which only processes new and changed files; you can add it to your nextcloud user’s cron to be executed every 10 minutes eg:

*/10 * * * * php /path/to/your/nextcloud/occ preview:pre-generate
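Note that occ commands generally have to run as the web-server user. In a system crontab (e.g. a file under /etc/cron.d, where a user field is available) the entry can also be guarded with flock so that an overlapping run is simply skipped — the www-data user and the lock path below are assumptions, adjust them to your setup:

```
# Run incremental preview generation every 10 minutes as the web-server user;
# flock -n skips this run if the previous one is still going
*/10 * * * * www-data flock -n /var/tmp/previewgen.lock php /path/to/your/nextcloud/occ preview:pre-generate
```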

Enforce SSH Connections Alive

Here are a couple of options you have in order to keep an SSH connection alive.

$ ssh -o ServerAliveInterval=60 <user>@<ip>

The above will instruct the ‘ssh’ client to send keep-alive packets every 60 seconds.

The following makes this the default for every SSH connection by adding the option to your ssh client’s configuration file:

$ echo -e "Host *\n\tServerAliveInterval 60" >> $HOME/.ssh/config
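The resulting stanza in ~/.ssh/config looks like the following; ServerAliveCountMax (shown here with its default value of 3) is an optional companion setting — the client disconnects after that many consecutive unanswered keep-alive probes:

```
Host *
	ServerAliveInterval 60
	ServerAliveCountMax 3
```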

Traccar – fix “The write format 1 is smaller than the supported format 2”

This error means the H2 database file was created with an older storage format than the H2 library now expects. The fix is to export the database with the old H2 release and re-import it with the new one:

# Stop Traccar so the database file is not in use
systemctl stop traccar
# Work in a scratch directory
mkdir /opt/FIXH2
cd /opt/FIXH2
# Fetch the old (1.4.200) and new (2.0.206) H2 releases
wget https://h2database.com/h2-2019-10-14.zip
wget https://github.com/h2database/h2database/releases/download/version-2.0.206/h2-2022-01-04.zip
unzip h2-2019-10-14.zip
mv h2 ./h2.2019
unzip h2-2022-01-04.zip
mv h2 ./h2.2022
# Locate the extracted jars
find ./ -name "*.jar"
# Copy Traccar's database into the scratch directory
cp -p /opt/traccar.4.13/data/database.mv.db /opt/FIXH2
# Export the database to a compressed SQL script using the OLD H2 version...
java -cp ./h2.2019/bin/h2-1.4.200.jar org.h2.tools.Script -url jdbc:h2:./database -user sa -script backup.zip -options compression zip
# ...then re-import it into a new database using the NEW H2 version
java -cp ./h2.2022/bin/h2-2.0.206.jar org.h2.tools.RunScript -url jdbc:h2:./database_new -user sa -script backup.zip -options compression zip
# Put the upgraded database in place and restart Traccar
cp ./database_new.mv.db /opt/traccar/data/database.mv.db
systemctl start traccar

HOWTO extract Nginx logs for the past hour/s

Here’s a quick and handy “awk” snippet to extract data from Nginx’s access or error log file for the past hour/s.

# awk -v d1="$(date --date '-60 min' '+%d/%b/%Y:%T')" '{gsub(/^[\[\t]+/, "", $4);}; $4 > d1' /var/log/nginx/access.log

This example shows how to extract data from /var/log/nginx/access.log for the past 60 minutes – ‘-60 min’; adjust the offset as needed. Note that ‘$4 > d1’ is a plain string comparison, so it is only reliable while both timestamps fall on the same day – the %d/%b/%Y layout does not sort chronologically across day or month boundaries.
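To see the filter in action without touching a live log, here is a small self-contained run against a fabricated log file (the entries, the IP, and the /tmp path are made up for illustration; both timestamps are on the same day, so the string comparison is valid):

```shell
# Two fabricated access-log lines: one before noon, one after
cat > /tmp/sample_access.log <<'EOF'
1.2.3.4 - - [10/Mar/2024:08:15:00 +0000] "GET /old HTTP/1.1" 200 612
1.2.3.4 - - [10/Mar/2024:14:45:00 +0000] "GET /new HTTP/1.1" 200 612
EOF
# Keep only entries newer than 12:00 on that day
awk -v d1="10/Mar/2024:12:00:00" '{gsub(/^[\[\t]+/, "", $4);}; $4 > d1' /tmp/sample_access.log
```

Only the second (14:45) line is printed, since the first one sorts before the 12:00 cutoff.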

HOWTO exclude specific packages from being updated via yum or dnf

Here are a few examples of how to exclude specific packages from being updated during a yum/dnf update:

# yum update --exclude=PACKAGENAME 

Exclude all kernel related packages during update:

# yum update --exclude=kernel*

Exclude gcc and java:

# yum update --exclude=gcc,java

Exclude all gcc and php related packages:

# yum update --exclude=gcc* --exclude=php*

In order to permanently exclude/disable updating of some specific packages, add an exclude line to your dnf.conf or yum.conf eg:

[main]
cachedir=/var/cache/yum/$basearch/$releasever
keepcache=0
debuglevel=2
logfile=/var/log/yum.log
exclude=kernel* php*
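With a permanent exclude in place, a one-off update of the excluded packages can still be forced with --disableexcludes, which accepts ‘all’, ‘main’ (excludes defined in the [main] section, as above), or a repository id:

```
# Temporarily ignore the excludes from the [main] section for this run only
yum update --disableexcludes=main
```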