I'm a complete newb when it comes to colorspaces, so answering any of the questions below would be of immense help to me.

1a) Is it true that ffdshow's raw video filters (more specifically the Resize & Aspect, Denoise3D, and Deband filters) only work in the YV12 colorspace? Or do they support YUY2 as well? Playing a video and then activating the EVR-CP stats page (shown by pressing Ctrl+J in MPC-BE/HC) shows that mixing is done in YUY2, but look at this: the HQDN3D (aka Denoise3D) README page here ( ) says that the 0.11 version only works in YV12.

1b) What about the Avisynth filter? What colorspace does it work in?

2a) Does RGB Conversion in ffdshow happen before all the other filters, or does it happen last?

2b) In the RGB Conversion menu, do the Dithering and "High-Quality YV12 to RGB Conversion" options only apply to YV12? According to the documentation here ( ), Dithering also works on YCbCr colorspaces. Does this mean that "High-Quality YV12 to RGB Conversion" works with YUY2? Note that the Dithering option can only be enabled when the "High-Quality YV12 to RGB Conversion" box is checked. (A rough sketch of what this dithering step does follows after these questions.)

2c) Does Dithering only work when converting from YCbCr to RGB24/32? What about RGB64? Is it redundant to have the Deband filter turned on when the LAV Video Decoder is outputting YUY2 (see question 2b above)?

3) When outputting to YUY2 solely in the LAV Video Decoder, does any kind of dithering happen? Is there any way I can turn this off, so that ffdshow's raw video filter can take care of dithering instead?

4) Does ffdshow's raw video filter support all of the colorspaces supported by the LAV Video filter? If not, would anything bad happen to the video's colorspace? Would you recommend that I just output YV12/YUY2 from LAV and then convert that to RGB32 in ffdshow?
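To make the dithering mentioned in questions 2b and 2c a bit more concrete, here is a minimal, illustrative C++ sketch. It is not ffdshow's actual conversion code; the 4x4 Bayer matrix, the test gradient and every name in it are assumptions for the demo. It only shows the general idea: when a higher-precision intermediate is rounded down to 8-bit output, a shallow gradient collapses into visible bands, while adding a small ordered-dither offset before rounding breaks those bands up.

// Illustrative only: why dithering helps when a higher-precision value is
// reduced to 8 bits (the same idea applies to a YCbCr-to-RGB conversion that
// goes through a higher-precision intermediate).
#include <algorithm>
#include <cstdint>
#include <cstdio>

// 4x4 Bayer (ordered dither) matrix.
static const int kBayer4[4][4] = {
    { 0,  8,  2, 10},
    {12,  4, 14,  6},
    { 3, 11,  1,  9},
    {15,  7, 13,  5},
};

// Plain rounding: long runs of pixels end up with the identical 8-bit code.
uint8_t quantize(float v) {
    return (uint8_t)std::clamp((int)(v + 0.5f), 0, 255);
}

// Rounding after an ordered-dither offset in [-0.5, +0.5): the bands break up
// while the average level of the gradient is preserved.
uint8_t quantizeDithered(float v, int x, int y) {
    float offset = (kBayer4[y & 3][x & 3] + 0.5f) / 16.0f - 0.5f;
    return (uint8_t)std::clamp((int)(v + 0.5f + offset), 0, 255);
}

int main() {
    // A very shallow gradient: it rises by only 1/16 of a code value per pixel,
    // so plain rounding gives long runs of identical output values (banding).
    for (int x = 0; x < 32; ++x) {
        float v = 100.0f + x * 0.0625f;
        std::printf("x=%2d  plain=%3d  dithered=%3d\n",
                    x, (int)quantize(v), (int)quantizeDithered(v, x, 0));
    }
    return 0;
}

Whether ffdshow applies this only during its YV12-to-RGB path or to YUY2 input as well is exactly what the questions above ask; the sketch only shows why the option exists.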
In their stand-alone form, some Avisynth plugins support more than one colour space, while some are YV12 only, and you'd hope that if an ffdshow filter supports YUY2 it'll be processed without converting it, but I don't know for sure if that's what ffdshow always does. The ffdshow Avisynth filter accepts YV12, YUY2 or RGB, but try typing Info() into the Avisynth filter, enable it and have a play, assuming you have Avisynth installed. I experimented a little with an Avisynth script outputting YUY2 by opening the script with MPC-HC and using ffdshow to process the uncompressed output: I moved ffdshow's Avisynth filter down to the bottom of the filter list, enabled it and typed Info() into the text section, and I tried AssumeFPS, ChangeFPS and ConvertFPS in MPC-HC.

About the crash, first please try the current archive. I did mess up with the x64 release though (it had a wrong old version), so I reuploaded the archive. If it still crashes, I need more information (e.g. frame rate, color format, duration, etc.). I can upload a debug version if you'd like to help, because I do not have a capture card.

About ffdshow's buffer option: according to their wiki, it is used to allow filters to access a configurable range of frames around the current position. In DirectShow, the upstream filters usually process a few frames ahead of time (like a prefetch) and send them down to the renderer with a "render timestamp". The renderer (e.g. madVR) caches whatever it receives and presents them when the time comes. The heavier the system is under load, the shorter the prefetch becomes (it could even lag behind, where it starts to drop frames). I implemented a similar buffer in the filter. This filter just saves all frames between the tip of the prefetch and the current play timestamp. Once a frame is "out of sight", it is discarded from the buffer. Like ffdshow, if any frame is requested outside of the buffered range, the nearest one is returned. Whenever a seek happens, I just flush the buffer to avoid the "ghost frame" problem I had with ffdshow. Because this mechanism totally relies on the upstream (or, more specifically, the splitter), I do not provide any option for it. If anyone finds a use case where "out of sight" frames are needed, I could provide an option to save those extra frames.
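As a rough illustration of the buffering scheme described above, here is a minimal C++ sketch. It is not the filter's actual source; FrameBuffer, Timestamp and the other names are placeholders. It just restates the mechanism: frames are stored keyed by their render timestamp, anything behind the current play position is discarded as "out of sight", a request outside the buffered range returns the nearest frame, and a seek flushes everything.

#include <cstdint>
#include <iterator>
#include <map>
#include <utility>
#include <vector>

using Timestamp = int64_t;              // e.g. DirectShow 100ns units
using Frame     = std::vector<uint8_t>; // stand-in for real frame data

class FrameBuffer {
public:
    // Called as the upstream prefetch delivers frames ahead of playback.
    void add(Timestamp renderTime, Frame frame) {
        frames_[renderTime] = std::move(frame);
    }

    // Called as playback advances: frames behind the current position are
    // "out of sight" and get discarded.
    void advanceTo(Timestamp playTime) {
        frames_.erase(frames_.begin(), frames_.lower_bound(playTime));
    }

    // Exact hit if possible; otherwise the nearest buffered frame (like ffdshow).
    const Frame* get(Timestamp t) const {
        if (frames_.empty()) return nullptr;
        auto it = frames_.lower_bound(t);                                  // first frame at or after t
        if (it == frames_.end()) return &std::prev(it)->second;            // t is past the newest frame
        if (it == frames_.begin() || it->first == t) return &it->second;   // exact hit, or t is before the oldest
        auto prev = std::prev(it);
        return (t - prev->first <= it->first - t) ? &prev->second : &it->second;
    }

    // On a seek, drop everything so no stale ("ghost") frame can be presented.
    void flush() { frames_.clear(); }

private:
    std::map<Timestamp, Frame> frames_;  // ordered by render timestamp
};

An ordered map keeps the frames sorted by timestamp, so both the eviction in advanceTo() and the nearest-frame lookup in get() are simple range operations; whether any extra "out of sight" frames should be kept is exactly the option discussed above.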