TIGSource Forums › Developer › Audio
Author Topic: Humanizing Virtual Instruments  (Read 2669 times)
Pete301
« on: April 18, 2015, 05:38:44 AM »

Hey all,

I thought I'd try to learn how to correctly and effectively humanize virtual instruments.

I've made quite a few tracks with virtual instruments now, but the more of them I add, the less life the track seems to have. I think I can narrow it down to a few factors...

1. The virtual instruments I have aren't that great; maybe I should invest in higher-quality ones if I want to compose purely for virtual instruments?

2. I'm not using the virtual instruments to their full potential.

3. My composition skills aren't fantastic, and the composition itself needs to become less robotic/on-the-beat.

I'd like your help with these, so the two questions I'd like to ask are:

What instruments do you use and trust to sound 'human-like'?

Are there any tips on how to get the most out of your instruments?

I'm talking purely about real instruments here, not synths and whatnot.
ZackParrish
« Reply #1 on: April 18, 2015, 06:39:17 AM »

Part of humanizing a track involves removing the perfection from the performance. A nicer sample library will generally help with this in some cases, but with things like piano, or repetitive staccato parts in other instruments, it's easy for something to sound robotic and fake even with round robins. Making good use of velocity shifts (assuming the samples you use have multiple recorded velocity layers) and modwheel/expression shifts will help tremendously.

Percussive instruments often sound robotic as well and are notorious for creating the 'machine gun effect'. That's where someone has, say, a snare part that is the exact same velocity on every note. Round robin helps a bit here too, but even then it's not foolproof. Accenting some notes while deadening others, and shifting the occasional note a few ticks to the left or right, will put in the human error that is ever so subtle but always there. Humans aren't perfect, and to really humanize a track, the performance in your computer music must be imperfect.
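The velocity and timing tricks described above can be sketched in code. Here's a minimal, hypothetical Python example; the note format and the jitter amounts are my own assumptions, not tied to any particular DAW or library:

```python
import random

def humanize(notes, vel_jitter=8, time_jitter=0.01, seed=None):
    """Return note events (start_seconds, velocity) with small random
    shifts in timing and velocity, clamped to the MIDI 1-127 range."""
    rng = random.Random(seed)
    out = []
    for start, vel in notes:
        v = max(1, min(127, vel + rng.randint(-vel_jitter, vel_jitter)))
        t = max(0.0, start + rng.uniform(-time_jitter, time_jitter))
        out.append((t, v))
    return out

# A "machine gun" snare part: identical velocity on evenly spaced hits
snare = [(i * 0.25, 100) for i in range(8)]
human = humanize(snare, seed=42)
```

Deliberate accents and deadened notes would still be written in by hand; the random jitter only covers the subtle, always-there human error.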

One feature I really like my sample libraries to have is true legato... as big a pain as it can be to use sometimes, it helps tremendously with realism. The trouble it creates is the small delay between the start and end note caused by the legato transition. In most cases you'll have to shift the ending note slightly to the left for it to be in sync with the rest of the track.

Just don't assume that higher-quality sample libraries are easier to humanize because they sound better in demo tracks. Some of them take a bit of time and patience to get sounding just right. You really have to pay close attention to detail when using them, and the more you practice making a lower-end library sound real, the less of a nightmare it'll be to make a higher-end library sound authentic.

I was going to answer your two questions directly, but I'm not sure in what context you're using the word 'instrument'. I can play piano, and I know the nuances required to make a virtual piano sound like someone is actually playing it (a cheap piano library usually does a poor job of helping with this). Good practice for humanizing is to listen to real instruments, particularly the ones you're using, and then try to replicate the same sound in your music. Also, be prepared to have your world shattered when your humanization gets up there with the audio demos you hear for sample libraries... because then the 'sample' aspect of them is all too clear, and it's very easy to distinguish real from fake. Mixing real instrumentation in with the fake helps remove this fakeness from the piece, though, so if I can afford it I'll try to hire someone to perform a key part in a track, just to have at least ONE part that isn't sampled but is a real performance.

FelixArifin
« Reply #2 on: April 18, 2015, 08:10:10 AM »

^ Everything that guy said, plus a little extra tidbit: Play around with reverb! It helps a little bit with the legato effect that you'll be looking for.

- Felix

Barendhoff
« Reply #3 on: April 18, 2015, 11:26:50 AM »

A good trick for making your tracks sound more realistic is to automate the crap out of the VSTs' volumes, especially in sustain sections. When playing an actual instrument, controlling volume comes naturally with practice; with a VST, not so much. Keep in mind that VST developers can't predict exactly how you're going to play the instrument, as there are too many possibilities for users to explore, even within a single playstyle. So they trust users to use the various modifiers to get the exact sound they want. This is easily forgotten, and basic volume automation is particularly easy to miss. Done right, volume automation can introduce great subtlety to your tracks and eliminate monotony. So I usually shape the ADSR by hand, automating the volume of my sustain sections.

As Zack mentioned, velocity should not be forgotten. Real players don't maintain the exact same intensity on whatever instrument they play, so don't do that in virtual orchestration either. The same goes for rhythm: while a robotically steady tempo makes it easier to orchestrate in your DAW, introducing tempo increases and decreases helps bring the piece to life.

Also, when you reach a resolving chord (sus4 to major, to name a basic example), it is tempting to maintain or increase the volume when the resolving chord hits. Try reducing the volume instead: introduce a fade-out right there. The effect can be quite... poignant, and I like to think it introduces a human touch to the virtual instrument.
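As an illustration of the volume-automation idea, here's a rough Python sketch that generates a swell-and-relax CC11 (expression) curve for a held note instead of a flat level. The resolution, depth, and base values are arbitrary assumptions:

```python
import math

def expression_curve(length_beats, resolution=16, depth=20, base=90):
    """CC11 values over a held note: a half-sine swell that rises to a
    peak mid-note and relaxes toward the release, instead of a flat line."""
    n = length_beats * resolution  # number of control points across the note
    values = []
    for i in range(n + 1):
        phase = i / n  # 0.0 at note-on, 1.0 at note-off
        cc = base + depth * math.sin(math.pi * phase)
        values.append(int(round(min(127.0, max(0.0, cc)))))
    return values

curve = expression_curve(4)  # a 4-beat held note
```

In practice you'd draw or ride this by hand per phrase; the point is simply that the level moves instead of sitting still.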

As for your question about which instruments I trust to sound human: I'm quite pleased with 8Dio's instruments, and I often use VSTs from Native Instruments. Still, I find myself using different libraries even within a single track; the differences in timbre and intensity can be used to simulate the differences between individual violas in an actual viola section, for example. And sometimes one library's legato sounds better than another's, which in turn has a nicer staccato. As long as the libraries aren't worlds apart and you can keep the overall sound coherent, it's perfectly fine to use different VSTs for a single instrument's part in the track.

Generally, more expensive VSTs tend to sound better than cheaper ones, but they're only as good as the user's ability to handle them. A cheap VST put to use properly can still sound a lot better than an expensive VST used poorly.

MoritzPGKatz
« Reply #4 on: April 19, 2015, 01:14:44 PM »

Hello,

The best "humanizing" effect is recording an actual performance and not just programming everything.

It helps if there are knobs you can wiggle, e.g. using a string library whose expression, vibrato, etc. can be modulated (EWQL, Spitfire, LASS...).
I've found it's often much faster and easier to achieve great-sounding takes if you do these things either on the fly or with a second MIDI overdub on top. Of course, I'll always tweak a couple of things and quantize quite a few things at least a bit.

It also helps if you can play the damn MIDI controller in front of you and not just use it to step notes in on the piano roll.

Even better than trying to get your sampled sounds to sound natural, and I agree with Zack here, is to record stuff yourself.
For example, a single well recorded and well versed solo string player can make a whole otherwise purely sample-based orchestra track sound that much more believable.

And it doesn't even have to be that sophisticated: every musician should be able to play 8ths or 16ths with a bloody shaker egg (or at least well enough to cut it together).
And just having a nice little recording spot, an SM57, and a box o' percussion will set you apart from the people who just drag in the same old Apple Loops or use the same old cracked sample libraries.

Cheers,
Moritz

dawid w. mika
« Reply #5 on: April 19, 2015, 09:40:55 PM »

+1 for the Shure SM57! This little thing still amazes me, and contrary to some newbie engineers' claims, dynamic mics are useful (especially with a decent preamp); you don't necessarily need expensive condenser microphones to get something usable. For some audio sources they can even sound better. Just take a look at this guy:

[embedded video]
As for humanizing VSTis, apart from all the good advice already given here, sometimes I try to make some VSTis sound nasty by routing them through bus tracks with heavy saturation. It really helps to blend VSTis that are either too perfect and sterile, or made by different companies and don't have anything in common. With this approach you can also blend poor VSTis with good ones by treating them as a 'lo-fi' element in your mix.

Here you can get a really nice and simple saturation plugin for free: http://www.softube.com/index.php?id=satknob
« Last Edit: April 20, 2015, 05:27:08 AM by dawid w. mika »

Jasmine
« Reply #6 on: April 20, 2015, 10:34:06 AM »

Quote from: dawid w. mika
sometimes I try to make some VSTis sound nasty by routing them through bus tracks with heavy saturation. It really helps to blend VSTis that are either too perfect and sterile, or made by different companies and don't have anything in common. With this approach you can also blend poor VSTis with good ones by treating them as a 'lo-fi' element in your mix.

This sounds super interesting. Do you have an example of this?

dawid w. mika
« Reply #7 on: April 21, 2015, 12:20:56 AM »

I don't have anything where it's easy to hear, but I'm working on something with a few stock Kontakt instruments that are quite sterile, so maybe I'll try to record a little tutorial about mixing VSTis.

WittyNotes
« Reply #8 on: April 25, 2015, 07:58:05 AM »

Agreed with what everybody has added so far, but here are some random tips and tricks to keep in mind.


1. Some libraries have nice humanize functions. I use Vienna Instruments for my strings and winds, and they have a nice pitch & delay humanization.

2. As others have said, recording the lines in yourself, even if it's a simple beat-per-measure line, will help greatly.

3. In Logic at least, there are velocity randomizers with a lovely degree of control that help spice up the lines. I'd guess most DAWs have something similar.

4. Sometimes I "stagger" held lines (i.e., I make them wave up and down slowly in volume), so that when several lines are stacked together, the ear doesn't receive one constant sound.

Just some ideas...
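The "staggered" held lines in tip 4 could be sketched like this: a hypothetical Python helper that gives each stacked line its own slow, phase-offset volume wave so the sum never sits at one constant level (the depth and period values are my own guesses):

```python
import math

def staggered_gains(num_lines, length_beats, resolution=8,
                    depth_db=2.0, period_beats=4.0):
    """One slow gain curve (in dB) per line, each offset in phase, so
    stacked sustained lines wave up and down instead of summing flat."""
    steps = int(length_beats * resolution)
    curves = []
    for line in range(num_lines):
        phase = 2 * math.pi * line / num_lines  # spread lines around the cycle
        curves.append([
            depth_db * math.sin(2 * math.pi * (s / resolution) / period_beats + phase)
            for s in range(steps)
        ])
    return curves

gains = staggered_gains(3, 8)  # three stacked lines over 8 beats
```

Each curve would be applied as volume automation on its own line; the phase offsets are what keep the combined level from ever standing still.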
dbfs
« Reply #9 on: April 28, 2015, 11:40:45 AM »

This echoes a lot of what other people have already said:

Pay attention to your note-ons and note-offs. Definitely shift the start of notes so they sync up with the beat well, but also pay attention to the ends of lines (this doesn't really apply to staccato notes). Notes usually end with a little decrescendo, not a complete cutoff, just as they don't usually start at full volume except for sfz/staccato material; they have a short, small crescendo up to full volume.

Offset the start of drum hits a tiny amount from each other- real players are not synced 100%. This will make your impacts sound bigger and more realistic.
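The drum-offset idea above can be sketched as a tiny helper: spread one nominal hit across several "players", each landing within about a dozen milliseconds of the beat (the spread value is an assumption, not a rule from any library):

```python
import random

def spread_hits(onset, num_players, max_spread=0.012, seed=None):
    """Return one slightly different onset time (in seconds) per player
    around the nominal hit, like a section that isn't sample-accurate."""
    rng = random.Random(seed)
    return sorted(onset + rng.uniform(-max_spread, max_spread)
                  for _ in range(num_players))

hits = spread_hits(4.0, 5, seed=1)  # five players hitting "on" beat 4.0
```

Layering the same sample at these slightly different onsets is what makes the impact sound bigger instead of phasey-perfect.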

Take into account when performers will need to breathe. Having horns or woodwinds playing constantly for 60 seconds is not realistic (this is only if you're going for super realism; it's very important when playing only a few instruments, less so when the whole orchestra is banging away).

Ride your velocity and any CC controls you have constantly; never let them stay stagnant, because musicians never do. Make your music breathe.

Don't be afraid to ride the mix of your instruments. I use Spitfire Albion and will ride the close/tree/ambient/outrigger mic mixes during a piece, as well as reverbs on groups and 2bus.

Don't be afraid to load up an unrealistic amount of groups- I use several string sections from Albion to play different lines. Without them the orchestra sounds too fake and empty.

Play all your lines in yourself (even if you go back and quantize them later). Slow the piece down by 50% if you have to, just record it yourself.
