In the order of 3 weeks

Posted by Marius Oberholster on Thursday, July 13, 2017 Under: WIP
Hey all!

This week's vlog:


GOD is so awesome!!!!!!

   HUGE NOTE:
Of course, as with all software recommendations on this site, throughout all time, results may vary and you use them completely at your own risk. I only share what has worked for me personally and can't make any guarantees for anyone else. In fact, I have to warn you that even something that is supposed to work out of the box can cause you to have to format and lose all your data (I almost lost all my data before due to trying to install dependencies for a program). Please, please, please - if you're not sure, ask someone who knows or find something else.

Okay, so with that covered, we can move onto the interesting part - the post!


 - Been away
   Over the past two weeks we moved. That meant the first week's vlog went out the window because my PC was packed, and the second week's because I just wasn't up to a vlog day on top of the unpacking process. That did not mean I didn't get to anything in relation to Exodus; it just meant that I didn't post about it. A lot happened within that first week and in the second week as well. We also have a different data plan (capped, but only for the moment), so I have to reduce the resolution and bitrates of the vlogs as well, so that they are as small as possible without compromising on the message :D! This shouldn't really affect your normal watching habits, as most people watch even tutorials at 360p!

 - Super Resolution
   I felt led to go digging for upscaling solutions, as well as frame interpolation resources, and quite frankly, the biggest surprise was when I came across the first golden nugget, called Google RAISR. Basically, it is a program that knows how to scan images for direction and interpolate smartly based on those directions. Things that have great contrast can now be upscaled to have crisp edges instead of pixel stairs! It's the best of anti-aliasing and interpolation all thrown into one. Super Resolution - this level of upscaling - is nothing new. In fact, there are a few methods that can be followed.

   One of my personal favorites, and one of the most obvious that most of us would miss, is the frame-seeking method (my own way of describing it). What this means is that the program goes through a video and checks following and preceding frames for more detail information, and combines it all to give you a truly higher resolution image. Great candidates for this sort of upscaling are panning shots, zoom shots, long speech shots where the characters and camera move a little bit, etc. I have tried one of these programs, but personally have not been able to reproduce their results; thankfully, they don't claim something that is unrealistic. I do believe the software can truly do what it says it can, I just don't have access to said footage. I would like to have been able to reproduce their results, but I only got a mild sharpening, nothing more... I sure would like to be able to try something that combines both searching and single frame information (if all of them don't do that already, hahaha)!! My theory is that it would kinda de-block within the first few frames (depending on its scan threshold) and have fuzzier areas where no additional detail could be found - yet still sharp because of compensation methods that also upscale using only the current frame.
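Just to make the frame-seeking idea concrete, here is a toy sketch in Python (purely my own illustration of the principle - not how RAISR or any of the programs below actually work internally): two low-res "frames" that sample the same scene at slightly different offsets, like a slow pan, together carry more detail than either frame holds alone.

```python
import numpy as np

# The "true" high-detail scene, as a 1D signal for simplicity.
high_res = np.sin(np.linspace(0, 4 * np.pi, 16))

# Two low-res "frames" that each see only half the samples, offset
# by half a pixel - like consecutive frames of a slow panning shot.
frame_a = high_res[0::2]
frame_b = high_res[1::2]

# Once the offset between frames is known, the samples can be
# slotted back together to recover detail neither frame had alone.
recovered = np.empty_like(high_res)
recovered[0::2] = frame_a
recovered[1::2] = frame_b

print(np.allclose(recovered, high_res))  # True
```

Real footage obviously isn't this clean - the program first has to estimate those offsets from the video itself, which is where the "great search engine" comes in.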

   I did find one application that does a very good job and for free as well, and it is called Reshade (cool name and easy to use as well - read the manual) by Vlad Hosu. Using some clever tricks in Blender, you can also upscale images, but you'd be hard-pressed to reproduce what this program can (I compared what I got with what his program can do). Btw, this is in no way sponsored, I'm just amazed by the technology that is currently available in this area and telling you guys about it. I am very thankful GOD had me on this search, because it allows me to do more in the future than I can right now, because of hardware and time limitations. Yes, I do realize upscaling is not the same as actually rendering at said resolution, but I'm sure with the frame searching method, you get pretty close to true resolution if you have very clean footage and a great "search engine", hahaha! :D
Reshade can be downloaded here:
http://reshade.com/

 - Frame interpolation
   Another one that is quite a challenge. Of course, the first program that comes to mind for this would be Blender. I mean of course, it is a 3D suite, but there is a bit of a problem here. When you want to interpolate, you have to analyze the video's frames as well as the codec's block movements in order to calculate a visual speed. This is often done at the pixel level too. Then, when motion is determined, the program does its best to distort a mix between the two frames, to give you the best possible new tweening. This often has areas that do well on-screen and areas that don't do as well - depending on how it's written, the footage, speed of the action, atmospheric effects, transition effects, etc. You name it - everything visual affects it. The less the interpolator has to do, the better the result will be, because it operates without bias. It only works with what it can "see"; it doesn't know a foot from a building. This means sudden flashes of light and other bursting-forth type action sequences will often have weird wobbles around the edges.

This kind of visual learning for motion, regardless of accuracy, is not in Blender. There are some ways to create difference masks and blend frames and so on, but that only creates a smooth motion based on two frames laid over each other. It also looks great, but it doesn't do well with fast motion and it doesn't really create the in-between frames - it just creates a sense of motion blur. I tried to figure out a way to make it know left from right based on normals and the differences between frames, but yeah, let's just say I don't know nearly enough to produce such a result... hahaha. I don't know how to tell Blender what is left, right, up or down.
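To show why blending alone can't replace real interpolation, here is a tiny Python sketch (my own illustration, not Blender's internals): a bright bar moves from column 1 to column 5, and a 50/50 blend of the two frames produces two half-bright ghosts instead of one bar at the halfway point.

```python
import numpy as np

frame1 = np.zeros((4, 8))
frame1[:, 1] = 1.0          # a bright bar at column 1
frame2 = np.zeros((4, 8))
frame2[:, 5] = 1.0          # the same bar, moved to column 5

# Plain blending - the only "in-between" you get by overlaying frames.
blend = 0.5 * frame1 + 0.5 * frame2

# A motion-aware tween would put one full bar at column 3; blending
# instead leaves two ghosts, which reads as motion blur, not motion.
print(blend[0, 1], blend[0, 3], blend[0, 5])  # 0.5 0.0 0.5
```

That is exactly the "sense of motion blur" effect: smooth-looking, but no new frame was really created.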

That meant we only had one avenue left! Off to Google! hahaha. So I went online to go look for something that would be able to process this sort of thing, and there are a lot of really great options. One of the best I've seen is Twixtor. Problem nr 1: I have nothing it can work in, and nr 2: I can't afford it, haha. So while it produces fantastic results, it's not an option for me. Since I tried so many, I've made a list instead of going on and on about each one:
 - Blender (no visual learning - some masking helps, but this one requires specific mention)
 - Movie Maker (general blending only, though great for frame rate conversion) (Now discontinued)
 - Slowmo Video (too low quality output, but awesome morphing result - I tried upping the quality with the VCodec options from FFMPEG, but nothing worked...)
 - GoPro studio (Flux just blended the frames - no morphing - possibly too little info?)
 - Butterflow (freezes on encoding - no compatible OpenCL device, but awesome result when working - see website for examples)
 - MotionPerfect (old reliable indeed, but fails on brightness bursts too, to the point of being unusable for me - but stunning overall nonetheless)
 - SVP, because of its supreme quality, would be awesome, but no dedicated or free converter (there is a free one for AVISynth, but non-commercial use only without a license)
 - AVISynth plugin called InterFrame (based on SVP) and AviSynth MSU Frame Rate Conversion Filter (all again only free non-commercially..., but their results are beyond words - examples on the sites)

Now remember one thing - I am not knocking these solutions in any way (you'll see Blender is up there too and I did not slam any program). I'm simply saying, these did not deliver what I wanted personally out of the program. For all I know, one of the above is perfect for what you want.

 - Splash 2.0 - There is an additional one that I did not try out, because it has both upscaling and interpolation, and that is called Splash 2.0. I haven't seen its results, but its website is certainly promising. It certainly sells itself well and isn't expensive, but is also out of my budgetary reach atm. I would like to be able to use it, but should it expire and become useless in that regard, I would've wasted a precious option... No examples on the website.

So with all the hours invested, do I have something that does work for what I'm looking for?
Depends on how you would define what I was looking for. I wanted something that had a nice interface, gave me access to advanced settings and zoom previews, etc, but didn't get any of that, haha. I got the opposite of what I wanted, which is commandline and scripting! haha. GOD still gave me the tools I needed, but it certainly did not come in the package that I wanted it to be in!

So, ultimately, yes, GOD certainly made sure I got what I needed and it's called MVTools (much more than just an interpolator). It is a feature-rich plugin for AviSynth (a frame server). AviSynth, for those who don't know, is a way to script changes into videos and have those changes reflect in realtime by opening the script in a video editor or video player - where you can then save these changes to a new video (VirtualDub is the most common script applier when it comes to AviSynth). AviSynth itself has no UI, nor does it have a scripting interface. You write a script for it in Notepad and you save it as an *.avs file. This then opens up in a video player or editor and shows you the changes that were made. AviSynth alone is sort of like watching TV in the old days - no way to save it; when it's done, it's gone, haha; only difference is you can watch it again.

In addition to MVTools, I needed a conversion solution - script to video - and GOD also provided that in the form of FFMPEG. AVS2AVI was my first golden nugget here, but it couldn't save to h.264, which is a major problem for file sizes. FFMPEG is of course still a commandline converter, and the file size is still huge (though a lot smaller) to retain quality. I wanted a UI, because I don't know a lot about these things, but I am telling you, GOD is pushing me to know more about this stuff! haha.

 - Here you can find each program mentioned above, in no particular order:
MVTools (AVISynth plugin) - http://avisynth.org.ru/mvtools/mvtools2.html
FFMPEG - http://ffmpeg.org/
AVS2AVI (commandline script to video converter) - http://www.moitah.net/
Slowmo Video - http://slowmovideo.granjow.net/videos.html
Splash 2.0 - https://mirillis.com/en/products/splash-free-hd-video-player.html
Windows Movie Maker - (Discontinued)
Twixtor - http://revisionfx.com/products/twixtor/
GoPro Studio - https://shop.gopro.com/EMEA/softwareandapp/quik-%7C-desktop/Quik-Desktop.html
Butterflow - https://github.com/dthpham/butterflow
MotionPerfect - http://www.softpedia.com/get/Multimedia/Video/Video-Editors/MotionPerfect.shtml
SVP - https://www.svp-team.com/wiki/Main_Page
Interframe (based on SVP) - http://www.spirton.com/uploads/InterFrame/InterFrame2.html
MSU Frame Rate Conversion Filter - http://www.compression.ru/video/frame_rate_conversion/index_en_msu.html
Blender - https://www.blender.org/

   Let's just refer back to Blender for a second. There is a node called Inpaint. What it does is expand the borders of an image into its alpha areas. That means that it can produce directionality based on the alpha edge and fill in based on difference. This means you can, for general slower motion objects, produce something similar to frame interpolation, but it is a very dirty result. It's not like what TVs do on-the-fly; nope, it has its own brand of artifacts that make it generally unusable overall for interpolation. On top of that, you also need to render a doubled version of your sequence to add in-between frames. That makes it in general a very impractical method, but ironically, still functional for slower moving things. If you have rapid zoom shots, this is certainly not what you are looking for, and it will not do an awesome job of taking your 4fps video and making it 60fps. It just won't, haha.
   While on the topic, I properly interpolated a video like that for the vlog, just so you guys could see an extreme comparison of what the program actually does (from 10fps to 30fps - yes, an astounding 20fps added from only what is there).

   The results from MVTools are astonishing to me! On the website, they have example scripts that allow you to kinda pick what you want, and I went for the 'best' quality (as they recommend), even though it is considered a lot slower (at about 4fps encoding on the fast side - I agree, it's not realtime, but still a lot faster than 3-5 mins per frame). It looks like those frames were truly rendered (there are artifacts and it also struggles a bit with bursts of light, but it really does an amazing job). You can make up your own mind whether it fits what you need.

It's tempting to mention a downside when it comes to the colorspace: it is mentioned online that there is a slight degradation in color with the filter's preferred spaces (YUY2 and YV12). But the so-called degradation is, as far as I could see, only a slight upping in brightness, which can easily be adjusted, as long as the main interpolation looks good and the quality of the export is good.

MVTools is definitely my interpolator of choice!!!

Just a heads-up if you didn't know: AviSynth can convert videos to the right colorspace within the same script, before the MVTools plugin is applied. So the script could read something like:

DirectShowSource("<Source>.avi")
ConvertToYUY2()
<MVTools settings that you can find on the website link above>

Then you chuck that through FFMPEG (32-bit) to a new output AVI and you have an interpolated video.
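For what it's worth, that FFMPEG step can be scripted too. Here is a sketch of the command I would build in Python (the FFmpeg options are real ones, but the file names and the quality setting are placeholders of my own - check the FFmpeg docs before trusting any of it):

```python
# Sketch of the script-to-video step: a 32-bit FFmpeg build with
# AviSynth support can open the *.avs script directly as its input.
cmd = [
    "ffmpeg",
    "-i", "interpolated.avs",  # the AviSynth script acts as the video
    "-c:v", "libx264",         # encode to H.264 to keep the size sane
    "-crf", "18",              # lower CRF = better quality, bigger file
    "output.mp4",
]
print(" ".join(cmd))
# With FFmpeg installed, you would run it with:
#   import subprocess; subprocess.run(cmd, check=True)
```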

Again, using these things is entirely at your own risk!

 - The reason for all this
   YouTube has had 4K support for a little while now and 4K typically runs at 60fps. My little PC can't cope with that. If I wanted to render at 4K, I would have to reduce frame rate or something else to compensate for the increase in render times. I really do believe GOD wants me to upload a 4K, 60fps version of Exodus. I can't imagine another reason for HIM to ask me to dig into interpolation and upscaling to such an extent.

There is a bit of a trade-off when it comes to conversion. Some will love it, and some will hate it. Regardless of which side of the fence you fall on, I truly believe that improving existing content in real-time (or through conversion) to fit new formats is a better way to go than to just keep pushing new formats (interpolation, upscaling, 2D to 3D and so on, are just the main examples - you also get color enhancement, comb filters and other unique tools that can enhance older content - even black and white to color, though this requires a much higher level of sophistication and recognition of shape to remain stable in playback - more about that later).

Also, remember, I am led to use these tools, because my hardware just doesn't support anything else atm. I can't render directly to 4K at 60fps. I know GOD has better for me when this is done, but I have to prove faithful and finish it first! :D

Just to give you some figures for the same frame (a fairly quick one) at various resolutions (Render and composite times - optimized for speed, except tile size):
4K - 08:25 (mm:ss)
FullHD - 02:21 (mm:ss)
50%HD - 00:57 (mm:ss)
It's interesting to note that the times more than double the higher the resolution goes - to insane amounts! hahaha. I also entertained the possibility of upscaling from 50%HD to get the render speed even faster, but the only words I have for it are eeeww and gross! hahaha. I showed that in the vlog and it's just not pretty. One format has to upscale to another for the best results - no skipping! haha.
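Working those figures out in Python (assuming standard 16:9 frame sizes of 3840x2160, 1920x1080 and 960x540 - that's my assumption, not stated above):

```python
# Render times from above, in seconds, against pixel counts.
times = {"50%HD": 57, "FullHD": 2 * 60 + 21, "4K": 8 * 60 + 25}
pixels = {"50%HD": 960 * 540, "FullHD": 1920 * 1080, "4K": 3840 * 2160}

# Each step up quadruples the pixel count...
print(pixels["FullHD"] // pixels["50%HD"],
      pixels["4K"] // pixels["FullHD"])           # 4 4

# ...and the measured times more than double each step.
print(round(times["FullHD"] / times["50%HD"], 2))  # 2.47
print(round(times["4K"] / times["FullHD"], 2))     # 3.58
```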

Just to show what it looks like, I upscaled and interpolated the intro (not much movement, but the text is clear, hahaha).



GOD is so great!

 - Black and white to color
   A while ago I put up a questionnaire on the website that had a few pictures on it that were in black and white. I asked people to look at those pictures and tell me what colors they believed the pictures were. As a researcher, I felt it was a moot point to some extent, but I still wanted a varied opinion on what people perceived black and white pictures' colors to be.

Why moot?
   Well, to be honest, we have lost all RGB information. Red, green and blue can all be the same brightness and virtually indistinguishable from one another in a black and white image. This means that just because you have a black and white image does not mean you can pull out color information. Let me illustrate that with a point. If you have the entire rainbow at the same brightness level, you will notice that when you desaturate it, the whole thing becomes a solid gray block. You have no way of knowing that these colors ever changed, much less which was what. A friend argued that one color is often darker than another, but in black and white you can't make such an assumption, or shadows will always be one color while bright areas will always be another, giving you a basic two-tone gradient anyway. This means that while you can recolor black and white images or video, you can't simply apply a typical algorithm to it and boom, fixed, like you can with 2D-3D or something similar.
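You can see this for yourself in a few lines of Python, using the classic Rec.601 luma weights (the standard black-and-white conversion - the two specific colors below are just my own picks to make the point):

```python
def luma(r, g, b):
    # Rec.601 weights: the classic "black and white" conversion.
    return 0.299 * r + 0.587 * g + 0.114 * b

red = (255, 0, 0)     # pure, fully saturated red
green = (0, 130, 0)   # a medium green - a completely different hue

print(round(luma(*red)))    # 76
print(round(luma(*green)))  # 76 - same gray value, hue info is gone
```

Both land on the exact same 8-bit gray, so no algorithm can tell from the gray alone which color was there.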

Findings
   Well, the findings told me one thing - everyone's taste will influence their decision on what colors to use for what. We have the basic understanding of period styles and of course the color wheel that tells us what goes well together and what does not. This gives us a reasonable level of accuracy, but artistry certainly comes into play, because quite frankly, varied opinions on color mean that the computer will not always be able to guess the color correctly just based on those assumptions.

Solution
   In one term: object recognition. That is truly the only way to get a computer to be able to add the color back to a video. It has to be able to distinguish between a person and a chair, a wall and a floor, varied breeds of dog, a single vehicle from frame to frame and scene to scene, so the dog doesn't suddenly change color. You would need one heck of an AI, or a massive team of artists, or just some basic color balance tools and motion tracking to help you get a good result. I vote for the latter, at least for now, because with the human element, you'd at least be able to retain some level of continuity throughout the video. One person isn't going to make a car green in one scene or frame and a shocking pink in the next - if they know it's the same car - silly example.

Attempts?
   One. I wanted to give this a shot using the keyframed masking we have in Blender, but because we don't have the color balance tools from Gimp, I don't know how to colorize each basic level of shading (shadows, mid tones and highlights) with varying tones to give a better result... So for me, unfortunately, it failed even on the first frame, hahaha. I do believe I just need a better understanding of how Blender's tools work, seeing that video is not the same as still photography, but we'll see! :D

For interest's sake, if you want to fill out the study, I would like to see what you experience! :D
http://www.pantherdynamics.yolasite.com/study---color-picks.php

Also, some really cool advances in this field - like I mentioned above, this does require learning on the program's part in order to do it and have it look good, and I am really surprised at how good these examples are:
 - http://demos.algorithmia.com/colorize-photos/
 - http://gizmodo.com/this-software-creates-vivid-color-pictures-from-black-a-1768422767

(I don't have a clue on compiling, so don't ask me about how to build and use it, haha - this one gets a special notice - use of any and all software recommendations are completely and wholly at your own risk!)

 - 2D to 3D artifacts
   Anyhow, back to Exodus, haha. With the 2D to 3D conversion, I did find some serious artifacts. Nothing that will be super annoying in every scene, but did prove rather distracting in one already rendered.
   Things like stretching and distortion are commonplace when it comes to conversion from something that lacks information to trying to recreate that information from scratch. That is what happens when you convert from 2D. When you have a proper 3D image, you are looking at two different angles of the same thing (two eyes). This means that you need the information for what is behind certain objects, in order for your eyes to have all the information. That information doesn't exist in a single angle shot or 2D image. There are multiple ways of reconstructing it, but for Blender, the best thing is to just blur or stretch these pixels to fill in the area. This can, in extreme places, create a glass-like warping; that is what has happened with Moses staring at the palace.
   If you look to the left side of his head, you can see a warped area (not super obvious in still image, but very obvious in video, haha). In Blender, I just don't have the tools I would like to, to reconstruct these areas, but I can say that I am thankful for the tools I do have and that I can at least have the stretched pixels. It makes a huge difference, as opposed to the tearing that was there! haha.

   One last thing I am looking into atm is the Inpaint node I mentioned earlier. If it can be used to compensate for the lack of info, that'd be awesome!! So far, attempts at 3D artifact reduction have been difficult, to say the least, haha.

 - Voice actors
   Well, this isn't a fun point to finish on. I did not hear back from most of the people I contacted to play the parts. Not that I haven't spoken to them since, but they just decided not to respond. I have had one respond though and he was willing to do it, but just didn't... It's been over a month and nothing... I wanted to contact them about it, but I didn't have any peace about it and time just ran out. I had to record all the parts and change the voices as best I could with the software and gift I have and those will be the final voices... I'm truly disappointed, but I have to believe that there is a reason GOD allowed it to turn out the way it did. I'm not mad at anyone, because I know that they are extremely busy and I also know that what I have asked, would take a lot of time. I just wanted a response, you know. Maybe I phrased it wrong or something. I don't know... Even the one that responded sounded like he wasn't sure he ought to be doing it...

   Thankfully, HE did not leave me without an answer. I downloaded a demo plugin a while back called Elastique Pitch V2. It is a magnificent plugin that can change the timbre and color of your voice with very good results. Of course, there is still some element of robotics in there, but to me, probably the best formant shifter and voice changer I've ever had the fortune of using!

Elastique Pitch V2 - https://products.zplane.de/elsatique-pitch-2
Voco - PhotoShop for Audio - http://www.bbc.co.uk/news/technology-37899902
Previous vlog with voice processing examples - https://www.youtube.com/watch?v=mZMXP0u6SEE
Screaming Bee voice changer - http://screamingbee.com/Product/MorphVOX.aspx

   So, in closing for these three weeks, I want to just remind you, there is only one voice actor in the video, so if the voices sound similar, it's because they are, hahahaha. I will still keep it before GOD to make HIS part HIS and to make the rest sound unique.

Biggest thanx to GOD for helping me with this. Without HIM, none of this would be possible! :D

Know JESUS yet? GOD is reaching out to you through these videos. HE loves you more than you can imagine and wants you to know HIM. Not because HE's lonely, but because HE loves you and paid a heavy price to enable you to go to Heaven, where HE is. See, we all have sinned in some way shape or form. That makes us guilty before GOD. If you have ever said that you hated someone, in GOD's standard, you have committed murder (James 4). If you've ever looked on a person with lustful thinking, you've already committed fornication with them in your heart and you are then guilty of sexual impurity. If you were to face GOD on judgement day, saying: "I was a good person" will be a void argument, just like it would be before a human judge. GOD knew that we would have to pay that penalty and that we could not pay it, even if we tried, but sent JESUS to pay the price for us. Your sins were put on HIM and the forgiveness that is there for you was by no means cheap - the cross was not padded and comfortable, neither was the death.
But, HE was raised up on the third day (after being crucified) and finished the work so that we may enter through HIM paying the penalty we were supposed to pay, but it is only valid for you, if you accept it and live for HIM.
If that is you today, please, follow this prayer:
"JESUS, I believe that YOU are real and I believe YOU died for me and was raised on the third day for me. Please forgive me for my many sins. I make YOU the LORD of my life, full time and to fill me with YOUR HOLY SPIRIT. Teach me how to live for YOU, help me become what YOU have made me to be, in JESUS' Name! Amen!"

To find out more, you can check out this link:
http://www.crossallegiance.org/knowJesus.html
It gives you access to free Bible translations, free teaching videos and more. Remember, this is a decision with eternal consequence, choose life, choose JESUS. :D

Have a great one!!!

Thank YOU!!!!!!



Tags: god  jesus  holy spirit  blender  anime  exodus  vlogs  7-12  kjv  interpolator  super resolution  upscaling  4k  hd  sound  voices  elastique pitch 

