I have reemerged

After a very long time, I mean, a very long time, I have reemerged.

What shook my unscathed slumber for years?

I’ll tell you what: I also don’t know. After a very long hiatus (which none of you are aware of, because you’ve potentially stumbled upon this desolate land for the first time, thus not knowing a thing about what I’m talking about, since not even a post about it existed), I just decided to post in this blog.

What you may not realize is that this post is actually a test of something. I don’t know, maybe it’s just a post to test how reliable my English skills are after all these years of living in a country that doesn’t speak English, a country that shaped the language into an indescribable pile of a mess that they love to use because their society decided so.

And because of that, due to the natural instinct of our human mind (provided it’s a normal one) to link things to whatever we stumble upon at the moment, I might create posts that pertain, or to put it adequately, relate to a popular language that the country I currently reside in uses intensively, or rather, exclusively.

And I think you know what language I’m talking about, which, by the way, I will immediately debunk by saying that it is actually false, because there’s no way you’ll know what I’m talking about. It’s one famously (notoriously?) “difficult” language, whose various forms of media have spread throughout the world and helped shape/influence various industries, specifically the technology and art/entertainment industries, into what they are today. And you might now know the reason why I wasn’t able to post a single damn thing for so long, aside from me being lazy and shamefully uncommitted to the path I declared I chose, even though no one actually cares.

There will be posts in the near future (that’s a relative term) that will touch on those topics, and they will help me organize my thoughts, as I myself seem to have stumbled upon a wide plateau that seems to cut off the passage through the mountain I want to climb. Everyone around me says that I am already fluent and very good at the language, but I refuse to acknowledge that, especially since it ain’t my native tongue. It is historically a very deep, sophisticated, dynamic, and contextual language. There is still a lot to learn.

There is a saying (that I invented) that one should not be content with the knowledge one has, as there is only one constant thing, and that is change, I mean, learning.

And I will give you a clue; it’s in Japanese.

omg i gave a clue i feel so smart


Resize/convert all images in a folder just by using batch script and FFmpeg

Have you ever had trouble converting thousands of existing photos, e.g. from PNG to JPEG? Can’t find a neat GUI program that actually does the job painlessly, without browsing through a slew of options? Do you actually want to waste time making a complicated batch script just to do a simple job, to satisfy your lazy constructive brain, the way I did?

While I’m sure there are programs out there in the woods that do exactly that, this very simple procedure will do the same job, in a small amount of code, at a much smaller size.

Assuming you didn’t accidentally stumble upon this simple, unoriginal, and messy post out of nowhere, and you have a genuine interest as a computer-savvy person, you can continue reading this post. Actually, no one can stop you anyway.

Assuming that you’re also using one of the modern, very popular OSes that is Linux, I mean, Windows, it’s just a bit of code away using Notepad. We’re gonna make a simple batch script.

The batch script

Batch scripts are very notorious. Aside from Batch being a very high-level language (it’s a script interpreted by an interpreted interpreter… sort of), a considerable amount of hackish code and workarounds are needed just to do freaking simple jobs that a low-level language can easily do, and that even Bash supports natively. A basic example is the sleep command (used to pause a program for a definite amount of time), which Batch (or cmd) mysteriously doesn’t have; instead you’d rely on ping, which is a network diagnostic tool. Most of this is basically a consequence of the nature of its syntax. It’s as if the language was passed along and played with by many engineers who each had their own fetishes, err, version of what kind of creature it should be, like a baitch (sorry).
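For contrast, here is that pause in Bash, with the classic cmd workaround as a comment (just a sketch; to be fair, newer Windows versions do ship a real timeout command that fills this gap):

```shell
#!/bin/bash
# Bash pauses natively:
sleep 1
echo "done waiting"

# The classic Batch workaround abuses ping's roughly one-second interval;
# pinging localhost N+1 times waits about N seconds:
#   ping -n 2 127.0.0.1 >NUL
```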

Compared to Bash, its Unix shell equivalent, Batch is frustrating to some extent. In spite of this shortcoming, Batch is still convenient and useful as a scripting tool, once you know your way around it. Heck, you can even write a routine that does complicated tasks, like syncing and patching files and binaries through a server using queue techniques, just with a batch script (in conjunction with real native programs, of course). I know, because I’ve done this. Source? Nah, you think you can get free shiny codes? Kidding.

Having read this post up to this point, you must have a bit of background with these things and concepts. If you believe that

@echo off

disables echoing of sound through your sound device, you will have a difficult time just understanding the syntax of this code.

Ignoring what I said above and moving on, let’s lay out the fundamentals:

set variable=value
echo %variable%

Knowing these fundamentals will give you a much more difficult time understanding the code that I will provide here. Trust me.

Next level, laying out some example ‘standard’ procedures:

setlocal enabledelayedexpansion

for /f "usebackq skip=2 tokens=1,* delims=:" %%G in ("!temp_folder!\!arrayfile!\temp_") do ...
for /f %%n in ('cmd /q /u /c "echo(%data3%"^|find /c ":"') do ...

echo( >NUL
call set /p variable=<%%~arg:~!_array!,1%% 2>NUL

If you’re a batch person, you’re gonna understand this immediately, like a native Latin speaker would understand Latin. An unsuspecting programmer who goes straight to writing code and whatnot without a care about the underlying OS can only scratch their head.

Wonderful, isn’t it?

Having given you a heads-up and a valid set of expectations, we now head to the star of the event: the code itself. (Why did it take so long?)


@echo off

set argument=%1
if defined argument (
  pushd "%~1"
) else (
  pushd "%~dp0"
)

set "bname=%~n0"
set "resolution=720"

echo %bname% | find "480"
if %ERRORLEVEL% EQU 0 set "resolution=480"

echo %bname% | find "720"
if %ERRORLEVEL% EQU 0 set "resolution=720"

echo %bname% | find "1080"
if %ERRORLEVEL% EQU 0 set "resolution=1080"

if not exist output mkdir output

for /f "tokens=* delims=" %%G in ('dir /A-D /B *.jpg') do (
  ffmpeg -i "%%~G" -vf scale=-1:%resolution% "output\%%~nG_resized%resolution%.jpg"
)

for /f "tokens=* delims=" %%G in ('dir /A-D /B *.png') do (
  ffmpeg -i "%%~G" -vf scale=-1:%resolution% "output\%%~nG_resized%resolution%.png"
)


Looks simple, yet complicated enough for such a small script, doesn’t it?

If you immediately (or even barely) understood the procedures above, congratulations! If you didn’t, don’t fret! This is actually a very simple batch script compared to the insane ones out there (I made one myself, well, I consider it one…). All it actually does is resize your JPEG/PNG images.

We’re gonna break it down into parts so you know which parts to edit to bend the program to your needs. But before that, did you notice the ffmpeg line in the code? We have to get and set that up before this program will even work!

Head over first to https://ffmpeg.org/download.html. Now click the most obvious icon there (I trust you to have a beautiful, imaginative mind). No, not the download button! The Windows icon! (So much for obvious!) Then click its corresponding link, ‘Windows Builds’. It will take you to another page. Now, get either the 32-bit or 64-bit static build (the latter if you’re on a 64-bit machine). My estimated size for the 32-bit FFmpeg is around 12 MB (64-bit is larger). I thought I said small size! Well, it’s still lightweight to me. After you’ve downloaded it, extract the contents and specifically isolate ffmpeg.exe. Copy it somewhere; any folder will do.

Now, on a related note, you can actually use FFmpeg to convert, encode, re-encode, extract, apply filters, anything you can do to manipulate your media, be it video, audio, or images. It is a little bit manual, but that is part of its charm. You can build automated scripts and macros to do specific jobs precisely. Fine engineering, at best.

Now, with your FFmpeg ready, copy the code above and paste it into Notepad. Save it as a batch file with the extension ‘.bat’ (e.g. “imageresize.bat”) in the directory/folder where you placed ffmpeg.exe previously, and… voilà! It is now operational and ready to go. You may have gotten the gist of how things work by now. You can also add the folder where ffmpeg.exe is located to the PATH environment variable, or place ffmpeg.exe directly inside your Windows system32 folder, to give more freedom to the location of the batch file (you can separate them this way).

To test if it works, from Explorer, drag the folder containing the images to convert and drop it on the .bat file you just created. If successful, ffmpeg.exe will run (as evidenced by the console output), convert your images one by one until all are finished, and then exit. You will find the results inside the \output folder within the folder of images you just converted. If all is well, all output files will be present there. If not, well, something must be wrong, which is beyond the topic of this post (sorry).

Without worrying about what each line does exactly, we proceed to the code breakdown. No, I haven’t forgotten it. And no, I won’t explain each command one by one, as this was intended to be a short post. Look what happened!

Code Breakdown

@echo off

set argument=%1
if defined argument (
  pushd "%~1"
) else (
  pushd "%~dp0"
)

The segment above sets the initial working directory based on whether we supplied an argument to the batch script or not. By dragging and dropping a folder onto the batch file, we essentially have Explorer pass that folder as an argument when running the batch file.

There are many ways to supply an argument to a batch script, like providing it through a shortcut, or running the script through cmd.exe with an argument. When no argument is supplied (essentially, just executing the batch file directly), the working directory defaults to the directory where the batch script itself is located.

So now, there are two ways to use the batch script!

  • Placing the batch script alongside the folder that contains the images to process, and
  • By dragging and dropping the folder itself to the batch script, wherever it is located.
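For comparison (since Batch gets compared to Bash a lot around here), the same ‘use the argument if given, else fall back to the script’s own directory’ logic in Bash would be this little sketch (the variable name is mine):

```shell
#!/bin/bash
# Use the first argument as the working directory if supplied,
# otherwise default to the directory containing this script.
workdir="${1:-$(dirname "$0")}"
cd "$workdir" || exit 1
pwd
```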

Now, to set what output resolution to use:

set "bname=%~n0"
set "resolution=720"

echo %bname% | find "480"
if %ERRORLEVEL% EQU 0 set "resolution=480"

echo %bname% | find "720"
if %ERRORLEVEL% EQU 0 set "resolution=720"

echo %bname% | find "1080"
if %ERRORLEVEL% EQU 0 set "resolution=1080"

Now, this is a neat old trick. Instead of manually editing the resolution of the output image inside the batch script every time, I opted to set it according to the filename of the batch script itself. If you rename your batch file from ‘imageresize.bat’ to something like ‘imageresize480.bat’, then instead of the default 720 pixels that I set there, the batch file will check its filename first, and since it contains ‘480’, it will set the resolution variable to 480. Now, if I change the filename to ‘imageresize4807201080.bat’…
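If it helps, here is the same filename trick sketched in Bash (the patterns are ordered so that the highest resolution wins when several match, mirroring the batch version, where the last matching check wins):

```shell
#!/bin/bash
# Derive the target resolution from the script's own filename,
# e.g. "imageresize480.sh" selects 480; the default is 720.
bname="$(basename "$0")"
resolution=720
case "$bname" in
  *1080*) resolution=1080 ;;
  *720*)  resolution=720  ;;
  *480*)  resolution=480  ;;
esac
echo "$resolution"
```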

if not exist output mkdir output

for /f "tokens=* delims=" %%G in ('dir /A-D /B *.jpg') do (
  ffmpeg -i "%%~G" -vf scale=-1:%resolution% "output\%%~nG_resized%resolution%.jpg"
)

for /f "tokens=* delims=" %%G in ('dir /A-D /B *.png') do (
  ffmpeg -i "%%~G" -vf scale=-1:%resolution% "output\%%~nG_resized%resolution%.png"
)

This is the main segment that does the legwork. I could have used nested for loops with a variable that contains the file extensions to loop through the data, but I figured that is a bit overkill for our simple, mundane task, even though it would have been extensible and would work brilliantly.

Now, if you know how to use FFmpeg (you can refer to its documentation), you can of course modify the ffmpeg lines to your needs. The script currently resizes each image to the resolution set in the previous code block, preserving its aspect ratio and format. Want to convert to another format? Just change the extension of the output file, e.g.:

ffmpeg -i "%%~G" "output\%%~nG.bmp"

That’s it! We need to wrap up this abomination of a post now, quick. There are so many possibilities with FFmpeg alone. Now you’ve got it: your ultimate image converter/resizer using FFmpeg!


On another related note, assuming you are a computer wizard/hobbyist/elite/shut-in, you can also build a custom FFmpeg with custom encoders from scratch using Cygwin, a convenient Linux-like shell for Windows. Your ’nix geekness will really shine there.

Hope this long post helps!

An update post, yay!

Thank you for being a patron of this website. I mean, for your first visit to this website. I mean, for your accidental visit to this website. You must have found a depressing part of the deep internet (ha).

This year was a very difficult one, emotionally, ideologically, economically, however you put it (it doesn’t concern you, anyway). Though this update post (you can’t even begin to call this a post…), however insignificant and useless its contents are, might give me (and you) a flicker of light and hope to continue moving forward in this life, just by being a post.

A post, whatever it is, however it came to be, marks progress; a stepping stone, I dare say, if we are being the figurative guys. So in a desperate attempt to maintain this disposal dump of a site, I felt the necessity to write this post. Progress, by being progress itself, is arguably the fundamental component of hope. Yep, you heard (or rather, saw) it first here.

A bit of history. Only a couple of posts below this one is a Happy New Year post. Incidentally, the next New Year is drawing near. That is very awe-inspiring, if I think about it. I’m dead serious.

Having been very busy figuring out where the heck the next useless checkpoint is in the game of the millennium, Real Life™ (you can’t revive back to a checkpoint), and still not figuring it out, I conclude that writing a post about future plans for your life, and the future itself, does nothing good and might just jinx your actual conceived future plans.

Instead of giving you an illusory stepping stone to enact said plans, you might as well have decided that they will not happen. It’s kind of the illegitimate sister of knowing a predetermined future: knowing that said future will inevitably happen, you gracefully embrace it and do nothing in an attempt to change it, in the comfort of inability. While that is totally fecked up, the same thing (“fecked up”) can be said of writing down your future plans and then having them not become reality.

Now, you may ask, ‘What’s with this update post? It is clearly a nonse… err, sensible post about some sort of future plans…’ I’ve got to tell you that you’re wrong. This is a ‘totally random post, made mostly of itchy fabrics of mental deductions of daily philosophical ideals that my brain conceived, thus it must be dumped’ kind of post. Any future plans written in this post are clearly for the amusement of the ideas this post begets.

tl;dr: There will be upcoming posts.

Now, that’s a good opportunity to be jinxed.

ChapterMerger Beta

While trying to play my MKV files on a “Smart” TV (it’s just a low-powered Android box, LOL), I was struck by a very common problem with media boxes. The video seemed to be shorter. Then I found out that it was actually missing parts, in this case the opening sequence and the ending sequence (some of you may be very familiar with this setup). When I tried to play it on my workstation (a fancy term for a good ol’ desktop box), it played fine, complete with the missing parts! Acting like I didn’t know what the heck actually happened, I searched for this phenomenon on the Internet, and behold: the one thing that made MKV greater among its circle, segment linking.

So I’ve been wondering, “How can I make this file playable on my ‘Smart’ TV?” “How was the player able to look for the correct file?” I took the liberty of tinkering with my MKV files, and I found out that it does this through the data provided by its chapter file; this is where the name “ordered chapters” originated. Luckily, this link is just a GUID-like hexadecimal code. Sounds like an alien artifact? Yeah, I know. This segment UID (SUID) identifies every MKV file and is used to locate the MKV file referenced by the chapter entries embedded in the MKV. Media players (the ones using the standard MKV library) only look for the linked file in the directory where the current file resides. When the file is found, the media player inserts it into the current stream as if it were part of the file being played. Hence the name, segment linking.

Now, the theory is apparently easy. In order for the file to appear “complete” on my media box, I just had to merge the linked MKV file into the original MKV file. The great thing about MKV compared with others (like MP4) is that it is a very flexible container: you do not need to re-encode the feces out of it just to add/remove media tracks, parts, and whatnot. There are a number of tools for this purpose; one of the most popular and successful in this realm is MKVToolNix, also available on Windows. It’s not just about merging the files, though: you also need to readjust the timecodes in the original MKV’s chapter file, or remove the chapters completely. Some people call this “unordering”. For the purposes of this article, let’s use that term.
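As a rough sketch of the mkvmerge side of this (the filenames are hypothetical, and the command is only echoed so you can see its shape without MKVToolNix installed; in mkvmerge, + appends one source file to another, and --no-chapters drops the chapters from an input):

```shell
#!/bin/bash
# Simplest case: the linked segment plays after the main part, so a plain
# append works. Real releases often need splitting at the right timecode
# first, as described above. Remove 'echo' to actually run mkvmerge.
episode="episode01.mkv"   # main file with ordered chapters (hypothetical)
linked="op.mkv"           # the externally linked segment (matching SUID)
echo mkvmerge -o "output/episode01_unordered.mkv" \
  --no-chapters "$episode" + "$linked"
```

(Appending requires the two files to have identically coded tracks, which segment-linked releases are produced to have anyway.)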

I was successful in doing that, but then I found myself a new problem: since I’m lazy, my brain decided I can’t just do this manually for every file (the process takes approximately 30 minutes per file, damn it), from finding the correct timecode at which to insert the externally linked file, to splitting the entire MKV file. Especially if it’s a TV series we’re dealing with (12 to 24 episodes, and sometimes 50+). Out of this frustration, I eventually ended up building a C# application to carry out all the tasks I needed, complete with a GUI (although I could have spent the time I took building this application on actually merging the files manually by myself).

(screenshot: ChapterMerger Beta)


This program is very, very simple. And it needs equally simple prerequisites in order to run:

For starters, it is just a mere wrapper around the great MKVToolNix. I am not yet that low as a programmer (low as in low-level programming, got it? Forgive me). This program must be run alongside the MKVToolNix tools inside the mkvtoolnix folder, or it can be run anywhere, provided that mkvtoolnix is included in the PATH variable.

Once you’re able to run it, you are presented with a really simple interface like the one above. Having read my description of “unordering” an MKV file, you can get the gist of how to operate this little program. You just have to add the MKV files, or the folder containing them. Make sure the segment-linked MKV files come along (99.9% of the time, these files come together). Then click ‘Analyze’. After the analysis completes, you can click ‘Merge!’. This outputs the new MKV files into an ‘output’ folder inside the folder containing the MKV files. Play one, and you now have an “unordered” MKV file!

You can also organize and manage project files that contain information like the files/folders to process, their configuration, and their analysis data (optional). Just use the appropriately named ‘Save Project’ and ‘Open Project’.

This is pretty useful (I say confidently) for anime-TV-series-related releases. Even though not a lot of fansubbing groups do this today (compared to the time when segment linking was first popularized in the anime scene), you’ll find it useful for older releases that you might have, as well as today’s.

So, if you have done what I have done, and you’ve been doing it, I hope this program can help you as it helped me.

The program is very early in its life, and it doesn’t deserve an alpha tag (it’s at least functional), so it’s in the beta stage now. I will update this program as time allows, so stay tuned!

Currently for Windows. I may add Mono support later on for cross-platform use.

You can grab the program via github — https://github.com/koohiikappu/chapter-merger/releases/.

You have a GitHub account? You can add me! Let’s follow each other!

Beware, it’s still in beta and may have bugs and unexpected things.

Very Late Happy New Year Everyone!

Yes, very late, indeed. But better late than never, no?

But in actuality, I did not need to post this post with its convenient title. I posted it anyway. Why? Is it habit? A trend? A standard in this society/culture? Or is it because I do mean it?

Why do people do this every year? Why waste money, time, personnel, gunpowder, paper? Do they do it for the same reason every year? Or is it because they are swayed by other people? That’s the thing about being social beings; we can’t help it, whatever the perceived reason for doing it, even though there is no point in doing it, even if we do not mean it.

Honestly, I do not mean it either. I do not mean a happy new year for everyone. Heck, I do not even know the anonymous you. I don’t even think someone would visit this site on purpose. I just posted it because that’s what everyone does; you feel the need to do something because the people around you did. That’s the power of social interaction. That, my friend, is being human.

When you don’t feel like posting “happy new year” and you cannot expect this year to be a happy one, for reasons like your life sucks™, or you’re just a plain old cynic, just think about this: we wouldn’t have reached this point of human evolution without being human, without people making examples from and for other people. This is seen in all fields and places humans go: art, technology, cities, cultures, etc. All of us revolve around human-acknowledged principles, like emotions, and they are part of every work humanity has done in this cruel universe, however insignificant we are. We can’t be human without norms and morality. And giving and getting a “happy new year” doesn’t hurt.

So, happy new year to you guys.

Amidst all this “long” post, the truth is, I just got back from playing Real Life™ (the game has realistic graphics and damage parameters) and I got nothing to post so I posted this instead.

Compiling FFmpeg with libx264 and libfdk_aac under Cygwin without Mingw32

After many days and nights, I have been through many battles. One battle involved me scratching my head many times: trying to build FFmpeg with Cygwin. A new post, and about computer stuff? Yay!

If you just happened to stumble upon this mostly-irrelevant-to-the-world post, and you know absolutely nothing about programs and have no interest in these kinds of things, you can opt out and stop reading right now, or better yet read some of my other posts, as if there are many worthwhile ones, or just go outside and do some “real world” stuff instead of sitting in front of your computer lazily glazing your eyes with artificial radiation. What the feck is artificial radiation anyway. If something referred you to this page, like a search engine, then I’m fluttered, I mean, flattered.

Continuing with the topic. This is about making the latest FFmpeg source snapshot at least compile within Cygwin under Windows. If you don’t know about Cygwin, it’s an excellent (at least for me) Unix shell layer for Windows. This can entice you if you miss bashing (sorry) and want to experience it under Windows, or if you want to use practical Unix tools under Windows.

The problem with Cygwin (a system always has a problem; it’s the charm of engineering) is that it’s practically an emulation of the Unix shell (even though technically it’s not an emulation; it’s more of an application layer, like Wine on Linux), so you are guaranteed to encounter a slew of problems here and there, especially where libraries and dependencies are concerned.

You may think you can get away with the base installation of Cygwin and just compile the dependencies yourself, but you’ll only waste your time. Because of the low-levelness of these things, they are specially pre-built by the Cygwin team, available in the repositories, and fetchable by the setup program. So we will use the pre-built versions of these little ones. However, some are missing from the repositories, most likely due to licensing issues. To get around this, we can either try to build them ourselves and cross our fingers, or find a third-party repository. For most of them, fortunately, there’s one available; we will come to that later. The rest we’ll build ourselves, coz we’re masochists.

And you might have thought, what the ♥♥♥♥ is FFmpeg? FFmpeg is the most powerful open-source command-line media factory for all your media-handling needs. Hands down. You can use it to efficiently convert, repack, encode, re-encode, etc. almost all image, video, and audio formats. It is the back-end that powers many powerful, often free, encoders, decoders, and players out in the wild, like MPlayer. Later versions also include a player and a server, ffplay and ffserver, respectively, but mainly it is used for repackaging and encoding purposes. We will only focus on the ffmpeg binary on this page.

For this guide, we will use another guide (so much for a guide). Head over to https://www.ffmpeg.org/platform.html and read the section “Compilation under Cygwin”, or read it all and get an idea of how these things work across platforms, or just ignore the link and continue below (trust me, the link is completely optional). It mentions these names:

binutils, gcc4-core, make, git, mingw-runtime, texinfo
bc, diffutils
libogg-devel, libvorbis-devel
yasm, libSDL-devel, libfaac-devel, libaacplus-devel, libgsm-devel, libmp3lame-devel,
libschroedinger1.0-devel, speex-devel, libtheora-devel, libxvidcore-devel

Install them through the Cygwin setup, together with the entire “Base” packages. gcc4-core is now gcc-core. git will not be used, but install it anyway (for the future). bc and diffutils are still important even if you don’t run FATE, so get them anyway. mingw-runtime is required even though we will not use Mingw32 for building. The 4th row is the most important, but not all of these are available in the official repos (you know the reason). To get the missing ones, follow the guide there and use the Cygport repositories. Make sure you get yasm and libtheora; the others provide codecs (popular ones, at that), so you might as well get them too. Since we will build libfdk_aac and libx264 ourselves, we can ignore libfaac, libaacplus, and libvorbis (they’re missing from the repositories anyway). You can also ignore libschroedinger, since we will ideally use x264 as the H.264 codec.

Additionally, install the packages:

wget, unzip, texi2html, curl, dos2unix, autotools (autobuild, autoconf, automake, autotoolset, all versions, for insurance), pthreads, cygport

dos2unix is required for converting formatting characters (e.g. newline characters) from DOS to Unix format. Make sure you get cygport; it provides the proper lib-utils to build under Windows, and without it, all of your attempts to build will throw errors. autotools is required for one particular build process. pthreads is required to build ffmpeg with pthreads support, which provides multithreading. You can build without it; in fact, it’s not required if you’ll only use x264 coding, as libx264 provides this functionality for itself.

After you’ve managed to get and install those packages, we will use another guide, again: https://trac.ffmpeg.org/wiki/CompilationGuide/Ubuntu. You can read that entire page, or just ignore the link and open your Cygwin Terminal. For the purpose of uniformity, we will follow the guide (you can take some liberties in the process, as long as it sounds logical). On the terminal, run:

mkdir bin && mkdir ffmpeg_sources && mkdir ffmpeg_build

Those will set up the working directories. After that, copy and paste into the terminal:

cd ~/ffmpeg_sources
wget http://download.videolan.org/pub/x264/snapshots/last_x264.tar.bz2
tar xjvf last_x264.tar.bz2
cd x264-snapshot*
PATH="$HOME/bin:$PATH" ./configure --prefix="$HOME/ffmpeg_build" --bindir="$HOME/bin" --enable-static
PATH="$HOME/bin:$PATH" make
make install
make distclean

Make sure to press Enter on the last command. This will build libx264 for us. If all required packages were installed, the build will succeed. You can find x264.exe inside ~/bin in your Cygwin home folder.

This builds the 8-bit variant. If you happen to want to encode 10-bit video, you have to add the --bit-depth=10 argument when running configure; thus the command:

./configure --prefix="$HOME/ffmpeg_build" --bindir="$HOME/bin" --enable-static --bit-depth=10

If you’ve come across --enable-win32thread: the name is sweet, but don’t ever use it!! (Not that you care.) You’ll lose multithreading capability, since we are not cross-compiling for Windows; we are compiling for Cygwin.

All done with libx264, run:

cd ~/ffmpeg_sources
wget -O fdk-aac.zip https://github.com/mstorsjo/fdk-aac/zipball/master
unzip fdk-aac.zip
cd mstorsjo-fdk-aac*
autoreconf -fiv
./configure --prefix="$HOME/ffmpeg_build" --disable-shared
make install
make distclean

This will build libfdk_aac. The “F” stands for Fraunhofer (it’s German). Again, if all the packages mentioned in this post were installed, this will build successfully. It doesn’t come with an executable binary, so it doesn’t produce an output inside ~/bin. We now have a library that will enable us to encode High-Efficiency Advanced Audio Coding (HE-AAC): quality audio at lower bitrates. That’s German engineering for you. Yay!

Now, finally, the ffmpeg itself. Run:

cd ~/ffmpeg_sources
wget http://ffmpeg.org/releases/ffmpeg-snapshot.tar.bz2
tar xjvf ffmpeg-snapshot.tar.bz2
cd ffmpeg
PATH="$HOME/bin:$PATH" PKG_CONFIG_PATH="$HOME/ffmpeg_build/lib/pkgconfig" ./configure \
  --prefix="$HOME/ffmpeg_build" \
  --extra-cflags="-I$HOME/ffmpeg_build/include" \
  --extra-ldflags="-L$HOME/ffmpeg_build/lib" \
  --bindir="$HOME/bin" \
  --disable-ffplay \
  --disable-ffprobe \
  --disable-ffserver \
  --enable-gpl \
  --enable-libfdk-aac \
  --enable-libmp3lame \
  --enable-libtheora \
  --enable-libvorbis \
  --enable-libx264 \
  --enable-pthreads \
  --enable-nonfree
PATH="$HOME/bin:$PATH" make
make install
make distclean
hash -r

This is my configuration. Be ready for a loooong compilation time. If you didn’t install libvorbis, you can remove its line, and the same goes for other codecs you didn’t install. We disabled ffplay, ffprobe, and ffserver for this setup, since we didn’t install some of their dependencies. If you want them, you can always head to the official FFmpeg documentation and wiki for more information on them and their dependencies.

Now, unless you find watching progress bars and numbers entertaining (like me), that’s a boring compilation! You will find ffmpeg.exe inside ~/bin. If you care about bits: we used x86 toolchains, so the binary is 32-bit. You’ll notice that it’s pretty big compared to other ffmpeg builds out there in the jungle, but don’t fret. It was built statically, which means the included libraries are all packed into that one binary, and it was built under Cygwin. It would be different if you had built it using Mingw32. And if you wanted to cross-compile it, you would have been better off compiling it under another sane platform (like Linux), or better, natively under Windows (if you know how to set up Visual Studio or Mingw32 for ffmpeg, that is).

But why Cygwin? Because this kind of work is sometimes just easier here, as this post alone demonstrates.

When you’re ready to use your newly built program, you can copy the binary anywhere and run it through cmd there (or through whatever application can drive it), or put it in a directory on your system PATH (e.g. WINDOWS\system32). You can even use it for scripting purposes; that’s part of the charm of FFmpeg, or of any other command-line program. Just remember that you need the external Cygwin libraries either in the same folder or on the system PATH to run this shiny little thing, since you technically compiled it under/for a UNIX environment. Depending on which packages you included during configuration, these include:


If you don’t want these extra DLLs lying around, you can always cross-compile, or compile natively on Windows. But that will be another story.
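If you are not sure exactly which DLLs your particular build needs, Cygwin’s cygcheck utility can tell you; it walks an executable’s DLL dependency tree. A quick check from the Cygwin shell (assuming the binary landed in ~/bin):

```shell
# List every DLL (including the cyg*.dll runtime libraries) that
# ffmpeg.exe links against; ship those alongside the binary.
cygcheck ~/bin/ffmpeg.exe
```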

To see the available command parameters, run ffmpeg -h.

An example command:

ffmpeg -i "!filepath!" -c:v libx264 -preset:v veryslow -crf 27 -c:a libfdk_aac -profile:a aac_he_v2 -strict experimental -b:a 32k -movflags +faststart -vf "scale=-2:480" -r 24 "!outfile!"

“!filepath!” is your input file, while “!outfile!” is your output file (the !…! syntax comes from Windows batch scripting with delayed expansion). For more information on how to use ffmpeg, FFmpeg has excellent documentation covering all of its flexibility and possible uses.
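Under Cygwin’s bash you could also wrap the same command for bulk conversion instead of using a batch file. A minimal sketch, assuming the build above; the helper name and the _480p suffix are my own inventions, not from the post:

```shell
#!/bin/bash
# Derive an output name next to the input, e.g. clip.mp4 -> clip_480p.mp4.
outfile_for() {
  printf '%s_480p.mp4' "${1%.*}"
}

# Run the example command from above over every .mp4 in the current directory.
convert_all() {
  local f
  for f in *.mp4; do
    [ -e "$f" ] || continue   # no matches: the glob stays literal, skip it
    ffmpeg -i "$f" -c:v libx264 -preset:v veryslow -crf 27 \
      -c:a libfdk_aac -profile:a aac_he_v2 -strict experimental -b:a 32k \
      -movflags +faststart -vf "scale=-2:480" -r 24 "$(outfile_for "$f")"
  done
}
```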

Happy Compiling!

Automatic Remote Connection through Deluge WebUI in Debian/Ubuntu-based Systems

In search of good BitTorrent clients on ’Nixes, one name may come to mind if you have ever dabbled in these things:

Deluge is one of the most customizable and feature-rich torrent clients out in the woods. Mostly used on the most popular platform (Linux, of course), it handles the daily business of most servers and torrent boxes. You can use a remote shell and remote clients to shape and manipulate your Deluge from afar, to suit your carnal needs.

To enable remote connections to Deluge through clients, there are two separate components at the back of the program that handle this task.

These components are the daemon and web interface, called deluged and deluge-web, respectively.

There are clients that connect directly to the daemon; a prominent example of this is deluge (the client) itself, with the classic interface option off. There are also clients that connect through the web interface deluge-web instead, which in turn connects to the daemon deluged. This approach is often preferred because it enables web security technologies, HTTPS for example. Clients using this approach require the web interface to be configured before communication with the daemon is possible.

Deluge can be run and configured easily out of the box on ’nix systems after installation. But to configure telekinesis (read: remote control), you normally need to download and install two separate packages. Their names are easier to guess than you might think. After installing them, you can enable them in your Deluge client and set things up for the remote hook-up.
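On Debian/Ubuntu-based systems, the package names match the programs they ship. A sketch, assuming the stock repositories carry them (they do on recent releases):

```shell
# Install the Deluge daemon and its web interface.
sudo apt-get update
sudo apt-get install deluged deluge-web
```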

Now, suppose you want to connect remotely to Deluge over your LAN from other devices, for the lulz. You need clients that support this feature.

But luckily, you have an Android, because the iPhone sucks. You use Transdroid. It connects to Deluge through the web interface, so the web interface needs to be connected to the daemon. That means every time you start your Deluge, you need to log in through the web interface to connect it to the daemon. This is fine on the first run. But when the box serving Deluge is not an everlasting server, that is, one turned off regularly, you need to set things up again every time! Why can’t they just meet automatically at the usual place and hook up already?

The solution is simple.

Assuming you have set up deluged and deluge-web on the local machine, your Deluge is running, the classic interface is disabled and the WebUI plugin is enabled, you’ll connect to the web interface for the first time. With default settings, that’s port 8112. So, in your web browser you type:

http://localhost:8112
Enter the default password, which is deluge, and connect to the local daemon (or a remote daemon, if you have one, for whatever reasons, like illegal ones).

Now, close your deluge client and daemon.

Open up your favorite text editor, in my case nano, and open web.conf:

nano ~/.config/deluge/web.conf

Find the line that says “default_daemon” and edit it accordingly. Assuming the default port and localhost:


"default_daemon": "127.0.0.1:58846",

Save the file, then start deluge and deluged. That’s the whole solution: every time Deluge starts, so does deluge-web, which now automatically connects to the daemon, so when you open the WebUI you don’t need to connect to the daemon manually, and neither does your remote client. Short, sweet and simple.
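If you provision your box with scripts, the same web.conf edit can be done non-interactively. A sketch, assuming GNU sed and deluged’s default address of 127.0.0.1:58846; stop deluge-web before editing:

```shell
# Replace whatever "default_daemon" currently holds with the given host:port.
# sed -i edits the file in place; back it up first if you're cautious.
set_default_daemon() {
  local conf=$1 daemon=$2
  sed -i "s/\"default_daemon\": \"[^\"]*\"/\"default_daemon\": \"$daemon\"/" "$conf"
}

# Usage against the real config:
# set_default_daemon ~/.config/deluge/web.conf 127.0.0.1:58846
```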

First Look on Rage of Bahamut: Genesis

IMAGE HEAVY. Scroll at your own discretion.

Spoilers may unintentionally be written.

Rage of Bahamut: Genesis (Shingeki no Bahamut: Genesis) is a surprise this season. I wasn’t really expecting anything from a mobile game adaptation, let alone a card game, and not even a physical one (video game adaptations oftentimes suck, like Yu-Gi-Oh!), yet it somehow delivers its first episodes gracefully.

Synopsis from the Internet:

Mistarcia is a magical world where humans, gods, and demons mingle together. In the past, the black-and-silver winged Bahamut has threatened to destroy the land, but humans, gods, and demons overcame their differences to fight together and seal its power. The key to that seal was split in two, one half given to the gods and the other to demons, so that they would never be united and Bahamut never released. Now, two thousand years later, the world is in an era of peace—until the day a human woman steals the gods’ half of the key.

The story follows the adventures of three characters. Amira, a demon girl who stole the gods’ half of the key (I know the synopsis above said a human woman… free spoilers!), must head north with the “help” of the perm-haired bounty hunter Favaro, who always crosses paths with his apparent arch-rival Kaisar, also a bounty hunter, who comes from a lineage of knights. What their fates have to do with the war between the gods and the demons, and with the threat posed by the raging Bahamut, the destroyer of the world, is yet to be seen. And what does Amira intend to do with the gods’ key?

Let’s talk about the art. The character design is great, with a style reminiscent of film-grade drawing. Because of this, the show certainly has the vibe of a film animation (which, I insist, is a very welcome feeling against the plethora of standard anime!). The animation is superb: fluid and consistent. It also makes good use of foreground effects and lighting, sported in good amounts throughout the current episodes.



Now, the opening and ending songs. Every anime ought to have them. They are a crucial part of the overall quality of a show, and I admit I sometimes skip them. But a good, fitting, thematic, catchy OP/ED? It becomes part of the 23-minute episode. It becomes part of the soul. You just can’t skip it, or at least not all of it. Bahamut’s OP rocks. It still sticks with me. I like alternative post-rock; you can’t blame me.

How about the ED? I do like its art sequence, but the song itself is nothing out of the ordinary. I was skipping it most of the time. So much for the speech about becoming part of the soul…

The soundtrack is, well, good. Nothing outstanding in particular, but it fits with the theme and genre. Not bad, I guess.

I may have one problem with one technical aspect of the show: the sound effects. They don’t have enough amplification and impact in some areas (like falling rocks). I don’t know if that is partly on Funimation (they are known for their eccentric re-encodes), but in the end it is just a minor problem, and the good aspects of the show have already compensated for it.

The first two episodes did a great job of introducing the universe and the characters, especially Amira. I really like how they showed the possibilities of what Amira’s outfit can become (wasn’t I supposed to be talking about a better point?). I haven’t personally played the game, or even known about it, but I guess lore from the game appears here. And they did a good representation of the cards, by the way (if I’m not wrong).

But I’m sad that they didn’t keep the long hair Amira sports earlier (twin-tails don’t hurt, though). Really.


However, now in episode 3, things may have started to get stale; at least that’s how it feels to me. The quality is still consistent, but the problem may lie in the pacing.

Nothing happened in this episode that really advanced the plot. Granted, the show is not episodic in nature, but it should have done something to keep expectations and interest up for the next episode.

We do have some goodies though. Rita enters the scene.



Amira and Perm-hair are still on their journey. They meet up with the guy who screams “Favaro!!”, so let’s call him Favaro!!(screams). Favaro!!(screams) gets himself the loli. (laugh)




Anyway, that’s all I can say for this episode. This was supposed to be a first look, alright?



With all the points above (I hope there are some), the show is worth checking out. Don’t get me wrong; it has its problems, but so do other great anime. It’s not bad, nor is it a masterpiece, but it is still better than your average anime.

I will be looking forward to the next episodes.

Unlimited Blade Works Ep. 2


Image heavy. Spoilers are kept as minimal as possible.

I was quite sad that the show returned to the usual 23-minute format (no more 47±-minute episodes )-:), but as long as each episode has the right pace and consistent quality, I’m okay with it. I mean, we can’t really do anything about their scheduling, right?

I really like the Fate/ series. It has this… unique storytelling. A unique setting with unique characters; a unique fantasy with dark subtleties. You won’t find anything like it, and its universe is quite rich. It takes a timeline spanning generations to tell this epic story. I won’t dive deeper into it, because that would defeat the purpose of watching the show and discovering it for yourself, you know?

As usual, the studio ufotable did an excellent job with the art quality. The backgrounds are awesomely detailed; even minor details are perfected. The hair movement has, without a doubt, received a blessing from the heavens. The characters look great in this episode. Rin (pictured above) looks great; I’ve gotten used to her new look now.

Saber also looks great. She is portrayed better here than in Fate/Zero, in my opinion. If you’ve played the game (which is great), you’ll find her bust faithful to the original source material. No, I didn’t notice her chest first.


What I really do like is Shirou’s improved character design. He looks better and is more aligned with his character, in my opinion, compared to the 2006 DEEN attempt.
And as usual, the animation is top notch, especially in this scene:


DAT ASS. Seriously, ufotable, that is a fine job of fan service. I don’t usually like fan service, especially in a series with meat and story like this one, but it raised my spirits. I liked it. Good job.

I like how they emphasized this scene, showing the pendant again. If you’re interested, this is a very important plot device.
This episode, faithfully following the source material, features the appearance of a recurring important character, Kotomine Kirei. Now, if you’ve watched Fate/Zero, you already know this character and his likely role and motivation in this chapter. If you haven’t, all the better. If you’re planning to watch the prequel, do it after finishing this sequel. Trust me, it will make more sense and you’ll appreciate the story more.





The scene’s lighting, if I remember the visual novel correctly, is quite different; it was much brighter there. But I’m fine with subtle changes (after all, this is an adaptation), and I liked this one. It adds to the dark atmosphere and fits the aura that Kirei brings when he enters the spotlight. It is important to note the subtle undertones in the character conversations here; the dialogue is worth remembering. Be patient with the infodump: infodumps are the necessary evil you must sit through to be immersed in and understand this epic story.


That face, though. I actually like how they portrayed Rin here. It emphasizes her character more.


But what makes this episode great (aside from Rin) is the cliffhanger at the end. God, I hate love cliffhangers! Damn you, ufotable, for delaying Berserker; I can’t wait for the fight scene. Ilya appears along with Berserker.


Overall, an absolutely great episode. Even though it doesn’t feature a battle, unlike the prior two episodes, the great animation, attention to detail and conversation already make up for it. Expect Ilya and Berserker in action next episode. It will be great, and must not be missed.

A late look at episodes 0 and 1 is coming up.


Thoughts: Lollipop Madness

Just last week, if I remember correctly, Google announced the next version of the Android operating system, Android 5.0 Lollipop. It follows last year’s KitKat, which is somewhat like Windows 8 in a Microsoft analogy. Lollipop is the next major overhaul since Ice Cream Sandwich, boasting a flatter design language called Material Design (this trend must have started 2-3 years ago… so familiar…), performance and stability boosts, better notification handling, “better” multitasking (which is just, you know, a visual upgrade), and other small, under-the-hood improvements. In the Microsoft analogy, this update would be like Windows 10.

Now, let me talk about a couple of things concerning this news.

The name. I was kinda bummed here. They obviously chose the most obvious of all sweet things! I thought Key Lime Pie was already great, but that turned out to be version 4.4, KitKat. They should have continued the trend set by Gingerbread, Honeycomb, Ice Cream Sandwich and Jelly Bean: not only are those sweets not all that popular worldwide (reality check), they are also unique names, and Lollipop broke that sacred naming convention. As for KitKat, don’t ask.

This part is terrible. When it comes to its design, the irony is that the update doesn’t even come to low-end devices and, more importantly, to those old flagship devices where it is needed the most. Lollipop was designed to be better optimized for devices with less memory, to have a more efficient and faster runtime in the form of ART (which also lessens processing load), and to have better battery optimization in the form of a Greenify-like feature baked into the system.

So it clearly carried the banner that KitKat raised: less fragmentation and uniformity across devices. Wait, that was the intention? Then why isn’t my Galaxy Nexus supported? They missed a great chance to explore the power and possibilities of this new operating system by making it available to those not-so-old, perfectly capable devices.

Now, the reasons for this are apparently obvious: the Galaxy Nexus has no updated kernel drivers, OEMs and carriers don’t update older devices, support for third-party hardware is unavailable, older devices have little storage space while ART takes more of it (which could be solved through external storage), and of course money and profitability. But not taking the chance to let these perfectly capable devices, the very devices that would benefit the most, get a lick of the oh-so-delicious Lollipop (no perversion intended) feels kinda frustrating. It could have cemented more footing for Android; it could have benefited Google and its partners, the developers, the whole freaking ecosystem.

Community developers made KitKat work on the Galaxy Nexus, and I wouldn’t be surprised if they did the same for Lollipop.

As for the initial roll-out, Lollipop ships first on the new devices that come with it (obviously), the Nexus 6 and Nexus 9, with the Nexus 5, Nexus 4, Nexus 10, Moto X, and even the freaking Nexus 7 following weeks later. Yeah, that device from two years ago, older than the Nexus 4. See the possibility?

This is a long post, actually a rant post, which is unusual for me, but it is just a side effect of my excitement over this Android release, especially at the thought that my Galaxy Nexus could get Lollipop after being given a new breath of life lately through a community-developed updated kernel and drivers. I’ll continue to watch the community from the shadows, and maybe I can share more information as these events unfold.