My site loads a lot of background images on a single page, so the total download was far too big and I needed to slim it down. The cwebp tool is great for this.
On Ubuntu, it can be installed with apt install webp.
For the details of webp transcoding, there are thorough instructions (a Chinese version is available) on this site that you can refer to:
https://developers.google.com/speed/webp/docs/cwebp
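To get a feel for the tool, converting a single image is a one-liner (the file names here are just placeholders; -q sets the quality on a 0-100 scale):

# Convert one jpg to webp at quality 60
cwebp -q 60 photo.jpg -o photo.webp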
Not all browsers support webp (Safari on older versions of macOS, for example), so you may want to keep a jpeg version alongside it as a fallback.
Taking Ubuntu as an example, the following two packages are needed.
I did the installation inside Docker, using phpenv, a simple Docker environment of my own design.
sudo apt install imagemagick webp
Once those installation commands complete, you have three tools at your disposal:
cwebp
identify
convert
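A quick way to confirm all three landed on your PATH is to ask each for its version:

cwebp -version
identify -version
convert -version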
With these commands in hand, the script below was born; it can be divided into three parts.
Before you start, please note that the script I wrote is extremely violent and does not keep the original images.
If you want to keep the originals, back them up and test it yourself first!!
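One simple way to make that backup, assuming the images sit in the current directory (the backup directory name is just a placeholder):

mkdir -p backup
cp -a *.jpg backup/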
#!/bin/bash
# Grab all .jpg files in the directory
for o in *.jpg
do
    filename="${o%.*}"
    # No webp with the same file name yet: optimize and generate one
    if [ ! -f "$filename.webp" ]; then
        cwebp -q 60 -resize 2048 0 "$o" -o "$filename.webp"
        # Get the transcoded width and height
        WxH=$(identify "${filename}.webp" | awk '{print $3}')
        # Prefix the original jpeg with origin_
        mv "$o" "origin_$o"
        convert "origin_$o" -resize "${WxH}" "$o"
        # Remove the original jpeg; comment this line out to keep it
        rm "origin_$o"
    fi
done

# jpg files larger than 600k that already have a webp were transcoded before;
# the jpg source was probably overwritten by a large file, so shrink it again
# to prevent the full-size version from being downloaded repeatedly
SIZE=600k
for o in $(find *.jpg -type f -size +${SIZE})
do
    filename="${o%.*}"
    if [ -f "$filename.webp" ]; then
        # Re-optimize the jpg using the existing webp's width and height
        WxH=$(identify "${filename}.webp" | awk '{print $3}')
        mv "$o" "origin_$o"
        convert "origin_$o" -resize "${WxH}" "$o"
        # Remove the original jpeg; comment this line out to keep it
        rm "origin_$o"
    fi
done

# Forced optimization: when the pass above is not enough and the jpg is still
# larger than 600k, bring out the big gun:
# cut the image width straight down to 1364, then transcode again
for o in $(find *.jpg -type f -size +${SIZE})
do
    filename="${o%.*}"
    cwebp -q 60 -resize 1364 0 "$o" -o "$filename.webp"
    # Get the transcoded width and height
    WxH=$(identify "${filename}.webp" | awk '{print $3}')
    # Prefix the original jpeg with origin_
    mv "$o" "origin_$o"
    convert "origin_$o" -resize "${WxH}" "$o"
    # Remove the original jpeg; comment this line out to keep it
    rm "origin_$o"
done
The first loop: normal processing. Grab the files that have a jpeg but no webp, transcode them, and shrink the image file.
It catches every jpg in the directory, takes its file name, and checks whether a webp with the same file name already exists (if one does, the image has already been converted).
When there is no webp with the same file name, cwebp does the transcoding.
After the conversion, the identify command grabs the width and height of the webp file.
Because the convert command cannot adjust to a target width with the aspect ratio kept automatically the way cwebp can,
the transcoded webp's width and height are read out with identify and fed directly to convert to shrink and optimize the jpg.
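For reference, identify prints one line per image with the geometry as the third field, which is exactly what awk '{print $3}' picks out. The output looks roughly like this (file name and numbers invented; the exact tail varies with the ImageMagick version):

$ identify banner.webp
banner.webp WEBP 2048x1152 2048x1152+0+0 8-bit sRGB 184KB 0.000u 0:00.000
$ identify banner.webp | awk '{print $3}'
2048x1152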
The second loop: re-processing, for when both the jpeg and the webp exist but the jpeg file is larger than 600k (how this happens: the jpeg was overwritten by the original large file, or the first pass genuinely could not shrink it, which is rare).
A jpg that has reverted to full size may be downloaded at full weight again, so it gets re-processed.
This loop therefore picks the jpg files over 600k that also have a webp with the same file name,
and uses the webp's width and height to optimize the jpg once more.
The third loop: scorched earth. Here comes my big gun: the image width gets reduced one more time, and both the webp and the jpeg have their widths cut in a single indiscriminate strike.
Some jpg images are just hopeless and stay over 600k even after optimization, hence this third, heavy-handed pass.
It uses the find command to grab every jpg larger than 600k, cuts the image width outright, and then optimizes again.
In theory you could back up the webp first and restore it after adjusting the jpg,
but yes, my solution is exactly this violent: the webp's width gets reduced along with the jpg, and nothing is restored.
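For anyone who prefers that gentler route, a rough sketch (untested, reusing the SIZE variable from the script above): stash the webp, shrink the jpg, then put the original webp back:

for o in $(find *.jpg -type f -size +${SIZE})
do
    filename="${o%.*}"
    # Stash the existing webp so its full width survives
    mv "$filename.webp" "$filename.webp.bak"
    # Transcode at the reduced width just to learn the new dimensions
    cwebp -q 60 -resize 1364 0 "$o" -o "$filename.webp"
    WxH=$(identify "${filename}.webp" | awk '{print $3}')
    mv "$o" "origin_$o"
    convert "origin_$o" -resize "${WxH}" "$o"
    rm "origin_$o"
    # Restore the original, full-width webp
    mv "$filename.webp.bak" "$filename.webp"
done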
If you use my script and still end up with files over 600k, inspect them yourself and handle them manually 🤣
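Listing those stragglers is a one-liner with find, run inside the image directory:

# List any jpg or webp still heavier than 600k
find . -maxdepth 1 -type f \( -name '*.jpg' -o -name '*.webp' \) -size +600k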