QTractor's Gaping Hole of Video Demonstrations

For me, the most exciting part of a multi-track sequencer is HEARING the results.

At the moment, the Qtractor website features:

  • great screen shots
  • an exhaustive manual
  • a clearly-explained change log (something missing from MANY Linux projects)

However, Qtractor is missing a very basic but POWERFUL method of getting people excited...VIDEO DEMONSTRATIONS!
For me, the most enthralling part of discovering new software is SEEING someone USING it to create amazing results!

__________________________________________

Here are two video demonstrations of software similar to Qtractor, called Open Octave:

======================
Video Demonstration Examples
======================

1.) Using Open Octave's GUI:
http://www.youtube.com/watch?v=ZcKz3Aw8zP8#t=19s

2.) Some of the dramatic results possible with Open Octave's tools:
http://www.youtube.com/watch?v=iv65yBQIdOg#t=36s

I definitely prefer Qtractor to Open Octave because, among other things, Qtractor is better integrated into a single package (and I love how it remembers my JACK settings).

Open Octave requires running scripts and other complications, while Qtractor lets me get started faster.

__________________________________________

So if you (and perhaps the Qtractor community) would take even a LITTLE time and make some BASIC video demonstrations of the music creation possible with Qtractor, I think it would be a HUGELY beneficial addition to the website.

I would suggest YouTube, because other video hosting services often take a while to buffer. (Vimeo!! >_>)

Thanks for reading.

rncbc's picture

Astonishing!

You sure are harnessing the possibilities to an extent that I would deem impossible. Incredible! One should never underestimate a power-user's imagination ;)

I stand in awe, driveling :O~

Thanks a lot.

P.S. If it's not asking too much, I would be interested in a tutorial-of-the-tutorial, meaning which specific steps you took to make the screencast with ffmpeg, jack_capture and Xephyr, and how you did the final production with the titling and those trivial things :) I guess you edited the video so as to make sure it is in sync with the full audio, something I found very hard to achieve with RMD without post-editing and stretching the video to fit the audio.

AutoStatic's picture

Thanks Rui, but I had to take the video offline because I mixed up the terms formant and carrier. I will upload a new video asap. Sorry about that. As for the tutorial: I used OpenShot to edit it, and because I used jack_capture and ffmpeg the audio is already in sync. I used Xephyr so I could create an X session with a 1280x720 resolution, because then the result will not be stretched or shrunk. But I will try to create a tutorial of the tutorial as well :)

Jeremy

AutoStatic's picture

http://www.youtube.com/watch?v=tusCeI1aQ4c

I mixed up formant and carrier in my previous video (which I've taken offline btw) so I've redone it with the correct terminology.

Best,

Jeremy

AutoStatic's picture

1. Make sure you have a recent version of ffmpeg compiled with support for the h.264 codec and that you have jack_capture installed. You also need Xephyr and a video editor; I use OpenShot myself.
2. Set up a nested Xephyr X server with a resolution of 1280x720, as this resolution yields the best result when uploading the final video to YouTube. I use a little script for this:

#!/bin/bash

# Set up nested X server
Xephyr -keybd ephyr,,,xkbmodel=evdev -br -reset -host-cursor -screen 1280x720x24 -dpi 96 :2 &
# Give Xephyr a moment to come up, then start a desktop session on the nested display :2
sleep 3
export DISPLAY=:2.0
/etc/X11/Xsession &

3. Start your screencast within the nested X server; I use a second script for this:

#!/bin/bash

DATE=`date +%Y%m%d`
TIME=`date +%Hh%M`
export DISPLAY=:2.0

# Start screencast
# Launch Qtractor with a prepared session inside the nested display (:2), a few seconds after the capture starts
(sleep 3;qtractor $HOME/Screencasts/Screencast.qtr) &
# Record the JACK audio with jack_capture; the xterm runs on the real display (:0) so it stays out of the screencast
xterm -display :0.0 -e jack_capture -b 24 $HOME/Screencasts/screencast_audio_$DATE-$TIME.wav &
# Grab the nested display with ffmpeg, video only (-an), 30 fps, lossless x264 (press q to stop recording)
ffmpeg -an -f x11grab -r 30 -s 1280x720 -i :2 -vcodec libx264 -vpre lossless_ultrafast -threads 4 $HOME/Screencasts/screencast_video_$DATE-$TIME.mkv

# Stop the audio capture once ffmpeg has exited
killall jack_capture

4. Now you have two separate files, an .mkv and a .wav file, and because jack_capture was started right after ffmpeg the audio should be no more than a few milliseconds out of sync (see the quick sync check after this list).
5. Edit the .wav file in Audacity or Rezound if necessary. I used some extra gain and compression.
6. Import the two files in OpenShot and crop start and end to your liking.
7. With VLC I made some stills from the video file and with Gimp I edited these to make the titling. You can also use Inkscape.
8. Create transitions, fade-ins and fade-outs to your liking with OpenShot.
9. Render/export the project. I use the following settings:
- Video: 1280x720, 30 fps (this should match the ffmpeg settings), mp4 video format, 40 Mbit/s (Blu-ray quality), libx264 video codec.
- Audio: mp3 (libmp3lame), 256 kbit/s and a 44.1 kHz sample rate.
10. Upload to YouTube, set it to private, wait for the video to be processed, and if the result is satisfactory set the video status to public.
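
Optionally, to quickly check the sync before editing, the two files can be muxed together with ffmpeg without re-encoding the video (a rough sketch; the file names are just examples, substitute your own):

# Optional sanity check: mux the grabbed video and the captured audio into one file.
# The video stream is copied as-is, the audio is encoded to mp3.
ffmpeg -i screencast_video.mkv -i screencast_audio.wav \
       -vcodec copy -acodec libmp3lame -ab 256k screencast_check.mkv

If the audio lines up here, it will also line up after importing both files into OpenShot.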

Hi there

Just one question to the developer: would it be possible (in the near future) to use songs recorded with Qtractor on computer 1 on computer 2, without having to systematically use the same paths to the songs? It is impossible to always remember exactly where some items have been saved on one computer when you use another system!

Thanks!

rncbc's picture

Q. would it be possible (in the near future) to use songs recorded with Qtractor on computer 1 on computer 2 without having to systematically use the same paths to the songs?

A. Yes. Qtractor sessions might, one day in the near future, be bundled into one single zip file. All files will get copied under the same (maybe temporary) session directory and zipped into a single archive. That's the plan. I might get there sometime, but to be honest it's not my highest personal priority ATM ;) BTW, it would be awesome if someone stepped in for the job. It's free open-source software development, isn't it?

Cheers

rncbc's picture

Yess, the day has come, the future is here now ;) Starting from svn trunk's rev. 1705 (qtractor 0.4.7.10+) one can save a qtractor session into an all-in-one archive/zip file. The new file type is designated through its new suffix (aka extension): .qtz.

Try it while it's hot! Have a go at it from the standard File/Save As... action.

Nevertheless, always remember the mantra: backup, backup, backup! There's hope that this brand new session archive file type (.qtz), which is a regular zip file for all intents and purposes, will be your session backup savior. Take care that this is all a pretty new optional feature and it should, er... must be tested ad nauseam first ;)
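
Since a .qtz archive is just a zip file, one can always peek inside it with standard zip tools as a sanity check (the session name below is only an example):

# List the contents of a Qtractor session archive; it is a regular zip file
unzip -l mysession.qtz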

Cheers

AutoStatic's picture

Made a screencast of a track that I finally sort of finished. In the screencast I've focused on using Qtractor as a mixer. As one of the very few apps supporting practically all plugin frameworks, and with its flexible routing possibilities, mixing with Qtractor just works very well. I also did the automation for the track in Qtractor by using the new MIDI mapping features for plugins.

The Infinite Repeat - Unaware of a Direction

Best,

Jeremy

rncbc's picture

Jeremy,

awesome song, awesome video, yeah awesome as always

thanks for your awesome support :)

cheers

AutoStatic's picture

Hello Rui,

Well, thank you for the awesome Qtractor!
Next video will be about setting up a mastering chain in Qtractor, I think. I just don't really get a grip on JAMin.

Best,

Jeremy

Really nice video demo, Jeremy!
By the way, may I know how you recorded the video?
With a desktop recorder + a JACK connection?
I also need to know how to record my desktop and connect the JACK audio.
Please explain which package you used for recording the desktop, so I can ask 64studio to include it in my OS 5.0 with Kubuntu 10.10 distribution.
Best
Domenik

AutoStatic's picture

Hello Domenik, thanks! The screencast was recorded with ffmpeg and jack_capture: http://www.rncbc.org/drupal/node/219#comment-3859. A worked-out version can be found here.
You could also use recordMyDesktop, but when you upload Theora-encoded video to YouTube it will degrade significantly.

Best,

Jeremy

AutoStatic's picture

I've uploaded a new video to YouTube for the most recent track I made. Again I used Qtractor as a mixer and plug-in host. And I recorded some extra parts and overdubs with it that I imported as samples in Hydrogen afterwards so I could easily trigger those samples with seq24.

The Infinite Repeat - Money or Love (DJ AutoStatic Remix)

Best,

Jeremy

rncbc's picture

Awesome! (where's the "like" button when one needs it in this crap of forum? :))

This video alone may very well be regarded as shining proof of what one can do with some of the Linux Audio/MIDI paraphernalia nowadays.

The result (the song itself) is, in my deepest and sincere opinion, world-class, and it certainly demonstrates a lot more about human talent than about any kind of hardware or software quality and/or capability.

I envy you, Jeremy ;)

Cheers

AutoStatic's picture

Hello Rui,

Thank you sir! Guess I'm on the right track!

Best,

Jeremy

AutoStatic's picture

My entry for the KVR One Synth Challenge 26: ZynAddSubFX

The Infinite Repeat - The Speeding Train

All instruments, noises and sounds generated by ZynAddSubFX/Yoshimi.

Best,

Jeremy

rncbc's picture

yada yada :)

ok. we have an issue here Jeremy.

you seem to be triggering qtractor from seq24 using the "monitor" facility over a MIDI track. right? check?

well, this is a bit embarrassing to me, but, unless your jackd settings for audio frames/buffer are pretty short and small, say 64 or 128 frames/buffer, i'm afraid you *do* have a midi timing quantization effect on the results of your performance.

i must say that the qtractor MIDI monitor feature for instrument plugins is sort of an impromptu helper, to check if things are being correctly routed and, yes, captured. but not for an incoming stream, let alone a performance.

let's do an experiment: gently increase the jack period to 1024 frames/buffer for instance; then try to reproduce your composition. you'll certainly notice that the timing of all or any of the zynaddsubfx plugins isn't, say, the same. terror, chaos, mayhem... the "groove" feel is, most probably, utterly broken and, adding insult to injury, it varies each time you play ... yep, that's what i'm talking about ...
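
if jackd is already running, the period size can also be changed on the fly with the jack_bufsize utility (assuming it is available with your JACK tools), for example:

# Bump the JACK period up for the experiment...
jack_bufsize 1024
# ...and put it back to the original setting afterwards
jack_bufsize 128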

i guess i'll have to cope with this new "trend" of using qtractor not quite as a sequencer per se but as an audio/midi mixer, ain't i?

cheers && great video btw (yada yada)

AutoStatic's picture

you seem to be triggering qtractor from seq24 using the "monitor" facility over a MIDI track. right? check?

Yes. I'm using seq24 as the main sequencer for this track and connect the ALSA MIDI outputs of seq24 to dedicated MIDI tracks in Qtractor, to which I added ZynAddSubFX-DSSI plug-ins, or inserts that pass the MIDI messages through to separate Yoshimi instances.
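
Roughly, the ALSA MIDI side of that routing can be made with aconnect (a sketch; the client:port numbers below are only placeholders, use whatever aconnect -l reports on your system):

# List the ALSA sequencer clients/ports to find seq24 and the Qtractor MIDI buses
aconnect -l
# Connect a seq24 output port to a Qtractor MIDI input port (example numbers)
aconnect 129:0 130:0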

well, this is a bit embarrassing to me, but, unless your jackd settings for audio frames/buffer are pretty short and small, say 64 or 128 frames/buffer, i'm afraid you *do* have a midi timing quantization effect on the results of your performance.

I'm using either 64 or 128 frames/buffer.

i must say that the qtractor MIDI monitor feature for instrument plugins is sort of an impromptu helper, to check if things are being correctly routed and, yes, captured. but not for an incoming stream, let alone a performance.

So basically it's better to do it another way? For this track it worked ok, but if it's far from optimal then I'll try setting things up differently in the future.

let's do an experiment: gently increase the jack period to 1024 frames/buffer for instance; then try to reproduce your composition. you'll certainly notice that the timing of all or any of the zynaddsubfx plugins isn't, say, the same. terror, chaos, mayhem... the "groove" feel is, most probably, utterly broken and, adding insult to injury, it varies each time you play ... yep, that's what i'm talking about ...

Ok, I'll give it a try. First I have to compile svn revision 1968 because of the audio glitches my band and I experience when recording >4 tracks simultaneously ;) I guess this might be why I couldn't use Arpage to do the arpeggio in the bridge; it was hopelessly out of sync, while on my little netbook it worked fine (on my netbook I used one single instance of Yoshimi for all the instruments).

i guess i'll have to cope with this new "trend" of using qtractor not quite as a sequencer per se but as an audio/midi mixer, ain't i?

Qtractor just works brilliantly this way: great routing possibilities, support for every single native Linux plug-in platform, MIDI control. But yes, I guess it might be better to compose in seq24, then record the MIDI output into Qtractor's sequencer and move on from there. But it's less flexible. It's just that I prefer step sequencing and Qtractor is not a step sequencer :(

Best,

Jeremy

rncbc's picture

as said, if you're using a 64 or 128 frames/buffer setting, it might not be an outrageously noticeable effect.

well, it may even sound a bit, say, humanized :)

technically speaking, and depending on the sample rate as well, this all means roughly 2ms tops of induced quantization effect--all events aligning to the start time of jack period/process cycles--put another way, that's the maximum jitter in most circumstances.
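
for instance (a rough calculation, assuming a 48000 Hz sample rate here; adjust to your own), the worst case is simply one jack period:

# Worst-case MIDI jitter in milliseconds = frames-per-buffer * 1000 / sample-rate
echo "scale=2; 64*1000/48000" | bc    # 1.33 ms at 64 frames/buffer
echo "scale=2; 128*1000/48000" | bc   # 2.66 ms at 128 frames/buffer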

anyway, i've made some improvement on it today and now it has become quite independent of buffer sizes, sample rates and all but the midi resolution (ppqn). also worth mentioning is that the change is only effective when transport is rolling (play is on).

re. svn trunk r1972+ (aka. qtractor 0.4.8.66+)

cheers
