Channel / Source:
TEDx Talks
Published: 2016-10-12
Source: https://www.youtube.com/watch?v=dz_jeuWx3j0
I grew up in Quebec, one of the largest French-speaking provinces in Canada. When it was time for me to go to college, I had the honor of being offered a scholarship by McGill University, often considered one of the most prestigious universities in Canada. Now, McGill is English-speaking, and at the time I was concerned that my English wasn't strong enough and that this would affect my grades. So I ended up declining the scholarship and instead going to the Université de Montréal, which is a good institution but, most importantly for me, is also French-speaking. Now, thankfully, my English has much improved since then, don't worry. But back then, I couldn't help but wonder whether I had made a horrible mistake.

Well, it turns out that this was the best decision I've ever made, because it's there that I got to join one of a handful of labs in the world that were doing research on the AI technology known as artificial neural networks. You might not have heard of artificial neural networks, but perhaps you've heard of the AI technology known as deep learning. Deep learning, for instance, is the technology behind the voice recognition in devices such as Siri on your iPhone, the Amazon Alexa, and other voice-enabled devices. Well, at the core of deep learning is the use of artificial neural networks. So what are artificial neural networks? They are computer programs that enable a machine to learn, and they are inspired by some of the computation that goes on in our brain, in real neural networks.
Now, consider the situation of building a machine that can read handwriting. At the core of artificial neural networks is the artificial neuron; you see one here. Much like real neurons, artificial neurons are connected: here we have a neuron that is connected to the pixels in the image, much like some of our real neurons are connected to our retina. The job of an artificial neuron is to detect patterns in its incoming connections. Much like real brains, which have many, many neurons, artificial neural networks have many artificial neurons, each doing a different thing, detecting a different type of pattern. And finally, much like real brains, which are organized into distinct regions, each performing a different function, deep artificial neural networks have multiple layers. This is the core idea behind deep artificial neural networks, behind deep learning, and it is meant to mimic some of what we see in our brains, where the light that hits our retina travels through multiple different regions before it eventually reaches an area with neurons that respond to more abstract concepts.
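To make this concrete, here is a minimal sketch (not code from the talk) of the idea in Python: each artificial neuron computes a weighted sum of its inputs and passes it through a nonlinearity, and a deep network stacks layers of such neurons, with deeper layers responding to more abstract patterns. The layer sizes, the sigmoid activation and the random weights are illustrative choices only; a real network would learn its weights from data.

```python
# A minimal sketch of an artificial neuron and a small multi-layer network.
import numpy as np

def layer(inputs, weight_matrix, biases):
    """A layer of artificial neurons: each neuron is a weighted sum of its
    incoming connections passed through a sigmoid nonlinearity."""
    return 1.0 / (1.0 + np.exp(-(inputs @ weight_matrix + biases)))

rng = np.random.default_rng(0)
pixels = rng.random(784)                                  # stand-in for a 28x28 image, flattened

W1, b1 = rng.standard_normal((784, 128)), np.zeros(128)   # layer 1: 128 neurons looking at the pixels
W2, b2 = rng.standard_normal((128, 64)), np.zeros(64)     # layer 2: patterns of layer-1 patterns
W3, b3 = rng.standard_normal((64, 10)), np.zeros(10)      # layer 3: more abstract concepts (e.g. 10 classes)

h1 = layer(pixels, W1, b1)
h2 = layer(h1, W2, b2)
output = layer(h2, W3, b3)
print(output.shape)   # 10 values, one per abstract concept the network could detect
```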
Now, it might not be so surprising that to develop artificial intelligence we might need an artificial brain. In fact, as far back as the 1980s, and even before then, there were a lot of researchers working on designing better artificial neural networks. But by the time I started my PhD, that activity had died down quite a bit, and only a handful of labs were still doing research on artificial neural networks. The reason is that, at the time, there were a lot of other machine learning methods that seemed more successful on simple AI tasks. In fact, artificial neural networks seemed to be mostly successful in their simple form, with a single layer, kind of like a brain with just one brain region. A lot of researchers had essentially given up on the artificial neural network approach, and it wasn't uncommon for researchers like me to submit work to conferences and get reviews that read a bit like this, where our work would get rejected just for using artificial neural networks. This isn't an exact quote; I couldn't find it back in my emails. You can probably imagine I didn't care much for it and just got rid of it.

And yet, fast forward ten years, and deep learning is all the rage. In academia, it's one of the most popular topics of research; in industry, deep learning technologies are being acquired for millions of dollars; and in the media and press, it's often reported as the new AI, much like in this piece in Scientific American. So what happened? What I thought I'd do today is give you my perspective on the last ten years of deep learning,
from its emergence to how it evolved and progressed through the years. I won't talk only about the different technology breakthroughs; I'll also focus a bit on how the community itself evolved and progressed.

For me, things really started in 2006. The thing that really influenced my research was this paper by Geoffrey Hinton, whom you see here, from the University of Toronto, with Simon Osindero and Yee-Whye Teh. In that paper, Geoffrey Hinton proposed a new approach to artificial neural networks, and what was really exciting about this work is that it produced deep artificial neural networks that could rival some of the more standard, more popular machine learning methods of the time. This sparked new hope that the approach of using artificial neural networks might actually be successful for achieving AI. Because this hope was new, people decided to come up with a new name for this type of research, and they called it deep learning. The next year, I co-organized with some of my colleagues the first deep learning workshop. We tried to organize it as part of the Neural Information Processing Systems conference, which is one of the largest machine learning conferences, and we submitted a proposal for the workshop, but it was rejected. However, Geoffrey Hinton just wouldn't have it, so we put together the resources necessary to organize it ourselves as a parallel event. It was a huge success: we attracted about ten times as many people as the other, official workshops that happened during the conference. It was clear there was a lot of excitement in academia about the potential of deep artificial neural networks, and over the next three years you started seeing the emergence of more and more papers on deep artificial neural networks, now referred to instead by the name deep learning. A lot of papers were published,
but progress was relatively slow. It turns out that executing artificial neural networks on regular computers is slow. So, around 2010, several different labs figured out a way of executing artificial neural networks not on standard computers but on graphics cards, or GPUs, the same graphics cards we use to generate crisp graphics for computer games. To me, this marks the first major way in which the deep learning community has been changing: it has become much better at exploiting computational resources. What this meant is that a deep learning research lab could, in a sense, build its own mini supercomputer for just a few thousand dollars.
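As a rough illustration of why this mattered, the sketch below (not from the talk) runs the same large matrix multiplication, the core operation inside a neural network, first on the CPU and then, if one is available, on a CUDA graphics card. PyTorch is used here purely as a present-day stand-in for the GPU tooling of that era; the matrix sizes are arbitrary.

```python
# The same neural-network arithmetic, dispatched to a graphics card when present.
import time
import torch

x = torch.randn(4096, 4096)    # a large batch of activations
w = torch.randn(4096, 4096)    # a large weight matrix

start = time.time()
y_cpu = x @ w                  # matrix multiply on the CPU
print(f"CPU matmul: {time.time() - start:.3f} s")

if torch.cuda.is_available():  # only if a CUDA graphics card is present
    x_gpu, w_gpu = x.cuda(), w.cuda()
    start = time.time()
    y_gpu = x_gpu @ w_gpu      # the exact same operation, now on the GPU
    torch.cuda.synchronize()   # wait for the asynchronous GPU kernel to finish
    print(f"GPU matmul: {time.time() - start:.3f} s")
```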
In fact, it's around that time that Geoffrey Hinton's lab produced the first results suggesting that deep learning might revolutionize speech recognition research. This came as a big surprise; the speech research community had some difficulty believing these results at first, and they were harder to publish initially. But now deep learning is present in a big way in speech recognition research, and it's also part of the technology behind Siri and Alexa.

Then, in 2011, we start seeing the emergence of a lot of really good, high-quality software libraries for supporting deep learning research, like Theano, Torch and a few others. To me, this marks the second way in which the deep learning community has been changing over the years: it now has a real dedication to creating high-quality, robust, easy-to-use, open and free code libraries to support deep learning research. It used to be that artificial neural networks were somewhat difficult to use and implement, but now it's actually quite easy to get started by leveraging the work of other people through these open-source libraries. The deep learning community has made performing deep learning research much less like carpentry and much more like playing with Lego.
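To give a feel for that Lego-like experience, here is a minimal sketch of what these libraries make possible, using PyTorch as a modern stand-in for the Theano/Torch generation of tools. The network shape, the fake batch of data and the hyperparameters are all illustrative; the point is that layers snap together and gradients are computed automatically.

```python
# A small network assembled and trained for one step with a modern library.
import torch
import torch.nn as nn

# Snap layers together like blocks.
model = nn.Sequential(
    nn.Linear(784, 128), nn.ReLU(),
    nn.Linear(128, 64), nn.ReLU(),
    nn.Linear(64, 10),
)

optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

# One training step on a fake batch of 32 "images" and labels.
images = torch.randn(32, 784)
labels = torch.randint(0, 10, (32,))

optimizer.zero_grad()
loss = loss_fn(model(images), labels)
loss.backward()    # gradients are computed automatically
optimizer.step()   # weights are updated; no hand-written "carpentry" required
print(loss.item())
```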
In 2012, Geoff Hinton and his lab prepare the next revolution, with deep learning this time disrupting computer vision. He and his lab participate in a computer vision benchmark competition, where the challenge is to design a system that can read a photograph and identify the objects and animals in that photograph. The results come in, and it turns out that their system literally crushes the competition, reaching accuracies that had never been seen before. This time, the breakthrough was undeniable, and computer vision is now also a field that is in large part dominated by deep learning methods.
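Today, reproducing the flavor of that result takes only a few lines, because networks trained on such benchmarks are distributed alongside the open-source libraries. The sketch below assumes a recent torchvision installation and an illustrative image file named photo.jpg; it is a rough modern illustration, not the original 2012 system.

```python
# Classify the objects in a photograph with a pretrained convolutional network.
import torch
from PIL import Image
from torchvision import models, transforms

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)  # trained on ImageNet
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

image = preprocess(Image.open("photo.jpg")).unsqueeze(0)  # add a batch dimension
with torch.no_grad():
    logits = model(image)
top5 = logits.softmax(dim=1).topk(5)
print(top5.indices, top5.values)   # the 5 most likely classes and their probabilities
```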
In 2013, the excitement around deep learning methods starts transitioning to industry, and in a big way. For instance, that year, with my colleagues from 2007, we decide to organize another edition of the deep learning workshop. This time, our proposal is accepted, and not only that, but we get folks from Facebook reaching out to say that their CEO, Mark Zuckerberg himself, actually wants to be present and participate. To convey how unusual this is: organizing an academic workshop and having Mark Zuckerberg show up is kind of like organizing a party with your personal friends and then, lo and behold, Mark Zuckerberg is there. This was a total surprise, and not just for someone like me, who, when I started my PhD, could barely get my colleagues working on other topics in machine learning interested in artificial neural networks; it was almost beyond comprehension. In fact, today the interest from industry is as high as ever.

Also at that workshop, we see the first demonstration, by a little-known startup called DeepMind Technologies, of the first version of their system that is able to play Atari games at the level of humans. Less than a year later, DeepMind was acquired by Google. Also that year, 2013, the International Conference on Learning Representations is created. I've had the honor of co-chairing that conference for the past two years, and I mention it for two reasons. The first is that this conference is now mostly known as the deep learning conference,
which means that by 2013 the community was big enough and vibrant enough to sustain its own conference. The other reason, the most important one, is that this conference has a very unique reviewing model for scientific work. Authors are asked to submit their work publicly, right away, on a website known as arXiv.org, so the work is immediately accessible to everyone, and then the whole community is invited to review and criticize this work right away, for everyone to see. To me, this marks the third way in which the deep learning community has been changing and evolving over time: it aggressively promotes the open discussion and open criticism of deep learning results. In fact, this approach, where as soon as you have results that can be presented you put them on arXiv and then discuss them openly, on social media for instance, has been widely adopted by deep learning researchers, instead of waiting for the seal of approval from conferences and journals. This is great for science: we get to iterate over ideas much more rapidly. It's not so great for scientists,
because any day can be the day you discover that some other lab has already executed the research idea you wanted to work on.

Then, in 2014, we start seeing deep learning systems that are very good with text. For instance, we see the first examples of deep learning systems successfully performing machine translation: taking in a sentence in a foreign language and producing an English translation. We also see systems that instead read an image and produce an English description of what that image shows. This is a really interesting example, because that year, within a few months, four different labs proposed more or less the same idea, at about the same time, independently. This really illustrates how rapid innovation had become by this time.
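One common design behind both of these systems, translation and captioning, was the encoder-decoder pattern: one network summarizes the input (a foreign sentence, or an image's features) into a vector, and a second network generates English words from that summary, one word at a time. The sketch below is a bare, untrained illustration of that pattern; the vocabulary sizes, hidden size and start-of-sentence token id are made up.

```python
# A minimal, untrained encoder-decoder: read a token sequence, generate another.
import torch
import torch.nn as nn

SRC_VOCAB, TGT_VOCAB, HIDDEN = 1000, 1000, 256

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(SRC_VOCAB, HIDDEN)
        self.rnn = nn.GRU(HIDDEN, HIDDEN, batch_first=True)

    def forward(self, src_tokens):
        _, state = self.rnn(self.embed(src_tokens))
        return state                        # a summary of the whole input sentence

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(TGT_VOCAB, HIDDEN)
        self.rnn = nn.GRU(HIDDEN, HIDDEN, batch_first=True)
        self.out = nn.Linear(HIDDEN, TGT_VOCAB)

    def forward(self, prev_tokens, state):
        output, state = self.rnn(self.embed(prev_tokens), state)
        return self.out(output), state      # scores for the next English word

encoder, decoder = Encoder(), Decoder()
src = torch.randint(0, SRC_VOCAB, (1, 7))   # a 7-token "foreign" sentence
state = encoder(src)

token = torch.tensor([[1]])                 # hypothetical start-of-sentence id
for _ in range(10):                         # greedy, word-by-word generation
    scores, state = decoder(token, state)
    token = scores.argmax(dim=-1)           # pick the most likely next word
    print(token.item(), end=" ")
```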
Thanks to GPUs, to graphics cards, and thanks to really good open-source software, we get to iterate and produce results very rapidly, and those results are communicated almost immediately, for everyone to digest and dissect, laying the groundwork for the next innovation.

In 2015, we start seeing deep learning systems that, instead of perceiving, that is, taking some input and making predictions, can actually generate or synthesize visual content. You see an example here of the now-famous neural style transfer algorithm, based on deep learning, which can take a photograph as well as a painting, and produce a painting of that photograph in the style of the painting that was provided.
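At the heart of that style-transfer algorithm are two losses: a content loss that keeps the generated image's features close to the photograph's, and a style loss that matches correlations between feature channels (Gram matrices) to those of the painting. The sketch below illustrates just those losses; in a full implementation the feature maps would come from a pretrained convolutional network such as VGG, whereas random tensors stand in for them here.

```python
# The two losses that drive neural style transfer, on stand-in feature maps.
import torch
import torch.nn.functional as F

def gram_matrix(features):
    """Correlations between feature channels: a compact summary of 'style'."""
    b, c, h, w = features.shape
    flat = features.view(b, c, h * w)
    return flat @ flat.transpose(1, 2) / (c * h * w)

def content_loss(generated_feats, photo_feats):
    return F.mse_loss(generated_feats, photo_feats)

def style_loss(generated_feats, painting_feats):
    return F.mse_loss(gram_matrix(generated_feats), gram_matrix(painting_feats))

# Stand-ins for feature maps extracted from one layer of a pretrained network.
photo_feats = torch.randn(1, 64, 32, 32)
painting_feats = torch.randn(1, 64, 32, 32)
generated_feats = torch.randn(1, 64, 32, 32, requires_grad=True)

# The generated image would be optimized so that this combined loss decreases.
total = content_loss(generated_feats, photo_feats) + 100.0 * style_loss(generated_feats, painting_feats)
total.backward()
print(total.item())
```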
But we're also now seeing a lot of work on generating entirely new visual content, like in this work from OpenAI, reaching levels of realism we hadn't seen before. And this goes even beyond visual content: we're seeing, for instance, recent work by Google DeepMind on generating audio, such as generating speech and generating music. Also, in 2016, perhaps you've heard about this DeepDrumpf Twitter bot powered by deep learning, where a deep learning system was trained on Donald Trump's tweets and was able to generate new tweets that might as well have come from him. Now, I might be making this sound easier than it was to achieve, but it is actually quite impressive. And 2016 will almost certainly be remembered as the year that Google presented their AlphaGo system, which competed against one of the world's best Go players, Lee Sedol, and won. This came as a big surprise to many in the community; many expected it would take many more years to actually achieve this. But today, AlphaGo, amongst its human peers, is recognized as the second-best Go player in the world.
So we went from deep learning systems that can take an image as input and detect simple objects in it, to deep learning systems that can both perceive and synthesize very complex content, such as photographs, speech, text or game strategies. We've come a long way, but there's still a long way to go before we reach true AI. I'm quite optimistic that deep learning will play an important role in that quest, not just because deep learning technology is powerful, but also, and I want to leave you with this, because the deep learning community has restructured itself to facilitate innovation very quickly. It has done this, first, by becoming much better at exploiting computational resources, using graphics cards; second, by becoming better at producing tools for performing deep learning research, with very high-quality open-source code libraries; and third, by becoming really good at discussing and sharing information about how to do deep learning, about the current state of the art and the recent breakthroughs, and opening up that discussion to everyone. We've come a long way in these three aspects since I did my PhD, and I think we can go even further. We're starting