1. Visiting EDU.LEARN for the first time

    You're visiting EDU.LEARN

    The best way to learn a language is to use it. In EDU.LEARN you will find interesting texts and videos that give you exactly that opportunity. Don't worry - our videos have subtitles that will help you understand them better. On top of that, clicking on any word gives you its translation and correct pronunciation.

    No, thanks
  2. Mini lessons

    Context matters a great deal when learning a language. Pictures, usage examples, dialogues, audio recordings - all of these help you understand and remember new words and phrases. That's why we created Mini lessons. These are short lessons built from contextual slides that make your learning more effective. There are four types of Mini lessons - Grammar, Dialogues, Vocabulary and Pictures.

    Next
  3. Videos

    Practice a foreign language while watching interesting videos. Choose a topic that interests you and a difficulty level, then click on a video. Don't worry, each one comes with subtitles. Or maybe you won't need them at all? Give it a try!

    Next
  4. Texts

    Read interesting articles that teach you new words and tell you more about the things you care about. Just like with videos, you can choose a topic and a difficulty level and then click on the article you want. Our interactive dictionary will help you understand even difficult texts, and the context makes the new words easier to remember. In addition, every article can be read aloud by a virtual speaker, so you also practice listening and pronunciation!

    Next
  5. Words

    Here you will find your "My Words" list and the word search feature - and soon a thematic dictionary as well. You can add words to the "My Words" list from the Videos and Texts sections. Every word added to the list can be reviewed later in one of our exercises. You can also go to your list at any time to check a word's meaning, pronunciation and usage in a sentence. Use the word search in the "Vocabulary" section to look up words in our database.

    Next
  6. Text list

    This list of texts appears after you click on "Texts". Choose a difficulty level and a topic, then pick the article that interests you. Once you are taken to it, click "Play" if you want it read aloud by the virtual speaker - a good way to practice listening. Some texts are especially interesting - they carry a badge in the top right corner. Be sure to read them!

    Next
  7. Video list

    This list of videos appears after you click on "Videos". Just like with Texts, first choose a topic that interests you and a difficulty level, then click on the video you want. Those with a badge in the top right corner are especially interesting - don't miss them!

    Next
  8. Thank you for using the guide!

    Now you know all of EDU.LEARN's features! We have prepared plenty of articles, videos and mini lessons for you - you are sure to find something that interests you!

    Now we invite you to register and discover everything the portal has to offer.

    Thanks, I'll come back later
  9. Help list

    Need help with something? Check our list below:
    No, thanks


Anders Ynnerman: Visualizing the medical data explosion


Level:

Topic: Science and technology

I will start by posing a little bit of a challenge,
the challenge of dealing with data,
data that we have to deal with
in medical situations.
It's really a huge challenge for us.
And this is our beast of burden.
This is a computer tomography machine --
a CT machine.
It's a fantastic device.
It uses X-rays, X-ray beams,
that are rotating very fast around the human body.
It takes about 30 seconds to go through the whole machine
and is generating enormous amounts of information
that comes out of the machine.
So this is a fantastic machine
that we can use
for improving health care.
But as I said, it's also a challenge for us.
And the challenge is really found in this picture here.
It's the medical data explosion
that we're having right now.
We're facing this problem.
And let me step back in time.
Let's go back a few years in time and see what happened back then.
These machines that came out --
they started coming in the 1970s --
they would scan human bodies,
and they would generate about 100 images
of the human body.
And I've taken the liberty, just for clarity,
to translate that to data slices.
That would correspond to about 50 MB of data,
which is small
when you think about the data we can handle today
just on normal mobile devices.
If you translate that to phone books,
it's about one meter of phone books in the pile.
Looking at what we're doing today
with these machines that we have,
we can, just in a few seconds,
get 24,000 images out of a body.
And that would correspond to about 20 GB of data,
or 800 phone books.
And the pile would then be 200 meters of phone books.
What's about to happen --
and we're seeing this, it's beginning --
a technology trend that's happening right now
is that we're starting to look at time-resolved situations as well.
So we're getting the dynamics out of the body as well.
And just assume
that we will be collecting data during five seconds,
and that would correspond to one terabyte of data.
That's 800,000 books
and 16 km of phone books.
That's one patient, one data set.
And this is what we have to deal with.
So this is really the enormous challenge that we have.
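To put those sizes in perspective, here is a rough back-of-the-envelope estimate in Python. The slice resolution, bytes per sample, and number of time points are assumptions chosen for illustration, not figures quoted in the talk; real numbers depend on the scanner and protocol.

    # Rough CT data-volume estimate. All parameters are illustrative
    # assumptions, not values from the talk.
    SLICE_PIXELS = 512 * 512       # assumed in-plane resolution
    BYTES_PER_SAMPLE = 2           # assumed 16-bit samples

    def dataset_size_bytes(num_slices, volumes=1):
        """Size of `volumes` reconstructed volumes, `num_slices` slices each."""
        return num_slices * SLICE_PIXELS * BYTES_PER_SAMPLE * volumes

    print(dataset_size_bytes(100) / 1e6, "MB")      # ~52 MB: a 1970s-style scan
    print(dataset_size_bytes(24_000) / 1e9, "GB")   # ~13 GB: a modern full-body scan
    # Time-resolved imaging (many volumes over a few seconds) quickly
    # approaches a terabyte per patient:
    print(dataset_size_bytes(24_000, volumes=80) / 1e12, "TB")  # ~1 TB

The exact figures shift with resolution and bit depth, but the orders of magnitude follow the progression the talk describes: megabytes, then gigabytes, then terabytes per data set.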
And already today -- this is 25,000 images.
Imagine the days
when we had radiologists doing this.
They would put up 25,000 images,
they would go like this, "25,000, okay, okay.
There is the problem."
They can't do that anymore; that's impossible.
So we have to do something that's a little bit more intelligent than doing this.
So what we do is we put all these slices together.
Imagine that you slice your body in all these directions,
and then you try to put the slices back together again
into a pile of data, into a block of data.
So this is really what we're doing.
So this gigabyte or terabyte of data, we're putting it into this block.
But of course, the block of data
just contains the amount of X-ray
that's been absorbed in each point in the human body.
So what we need to do is to figure out a way
of looking at the things we do want to look at
and make things transparent that we don't want to look at.
So transforming the data set
into something that looks like this.
And this is a challenge.
This is a huge challenge for us to do that.
Using computers, even though they're getting faster and better all the time,
it's a challenge to deal with gigabytes of data,
terabytes of data
and extracting the relevant information.
I want to look at the heart,
I want to look at the blood vessels, I want to look at the liver,
maybe even find a tumor
in some cases.
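The "make things transparent" step is usually done with a transfer function during direct volume rendering: every sample along a viewing ray is mapped to a color and an opacity, and the samples are composited front to back. A minimal single-ray sketch in Python follows; the density values and thresholds are invented for illustration and are not taken from the talk.

    def transfer_function(density):
        """Map a raw density sample to (gray level, opacity).
        Thresholds are illustrative, not clinical values."""
        if density < 0.3:       # air and tissue we want to hide
            return 0.0, 0.0
        if density < 0.7:       # mid-range tissue, semi-transparent
            return 0.6, 0.05
        return 1.0, 0.9         # dense structures such as bone, nearly opaque

    def render_ray(samples):
        """Front-to-back alpha compositing along one viewing ray."""
        color, alpha = 0.0, 0.0
        for density in samples:
            c, a = transfer_function(density)
            color += (1.0 - alpha) * a * c
            alpha += (1.0 - alpha) * a
            if alpha > 0.99:    # early ray termination
                break
        return color

    # One ray through a made-up profile: soft tissue, then bone, then tissue.
    ray = [0.2] * 50 + [0.8] * 10 + [0.2] * 50
    print(render_ray(ray))

In a real renderer this loop runs in parallel for every pixel on the graphics card, which is why the commodity GPUs he turns to next make the whole thing interactive.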
So this is where this little dear comes into play.
This is my daughter.
This is as of 9:00 am this morning.
She's playing a computer game.
She's only two years old,
and she's having a blast.
So she's really the driving force
behind the development of graphics processing units.
As long as kids are playing computer games,
graphics is getting better and better and better.
So please go back home, tell your kids to play more games,
because that's what I need.
So what's inside of this machine
is what enables me to do the things that I'm doing
with the medical data.
So really what I'm doing is using these fantastic little devices.
And you know, going back
maybe 10 years in time
when I got the funding
to buy my first graphics computer.
It was a huge machine.
It was cabinets of processors and storage and everything.
I paid about one million dollars for that machine.
That machine is, today, about as fast as my iPhone.
So every month there are new graphics cards coming out.
And here are a few of the latest ones from the vendors --
NVIDIA, ATI, Intel is out there as well.
And you know, for a few hundred bucks
you can get these things and put them into your computer,
and you can do fantastic things with these graphics cards.
So this is really what's enabling us
to deal with the explosion of data in medicine,
together with some really nifty work
in terms of algorithms --
compressing data,
extracting the relevant information that people are doing research on.
So I'm going to show you a few examples of what we can do.
This is a data set that was captured using a CT scanner.
You can see that this is a full data set.
It's a woman. You can see the hair.
You can see the individual structures of the woman.
You can see that there is scattering of X-rays
on the teeth, the metal in the teeth.
That's where those artifacts are coming from.
But fully interactively
on standard graphics cards on a normal computer,
I can just put in a clip plane.
And of course all the data is inside,
so I can start rotating, I can look at it from different angles,
and I can see that this woman had a problem.
She had a bleeding up in the brain,
and that's been fixed with a little stent,
a metal clamp that's tightening up the vessel.
And just by changing the functions,
then I can decide what's going to be transparent
and what's going to be visible.
I can look at the skull structure,
and I can see that, okay, this is where they opened up the skull on this woman,
and that's where they went in.
So these are fantastic images.
They're really high resolution,
and they're really showing us what we can do
with standard graphics cards today.
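"Changing the functions" means switching between transfer-function presets: one that keeps soft tissue, one that keeps only bone, one that keeps only very dense material such as metal. A hypothetical preset table (the CT-value thresholds are rough assumptions, not values from the talk or any clinical standard) could be as simple as:

    # Illustrative transfer-function presets keyed on approximate CT values
    # (Hounsfield-like units). Thresholds are rough assumptions.
    PRESETS = {
        "soft tissue": lambda v: 0.8 if -100 <= v < 300 else 0.0,
        "bone":        lambda v: 0.9 if 300 <= v < 2000 else 0.0,
        "metal":       lambda v: 1.0 if v >= 2000 else 0.0,
    }

    def opacity(value, preset):
        """Opacity assigned to one voxel under the chosen preset."""
        return PRESETS[preset](value)

    print(opacity(500, "bone"))    # 0.9 -> the skull shows up
    print(opacity(500, "metal"))   # 0.0 -> the same voxel is hidden

Switching presets only changes this mapping, not the data, which is why the view can be updated interactively.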
Now we have really made use of this,
and we have tried to squeeze a lot of data
into the system.
And one of the applications that we've been working on --
and this has gotten a little bit of traction worldwide --
is the application of virtual autopsies.
So again, looking at very, very large data sets,
and you saw those full-body scans that we can do.
We're just pushing the body through the whole CT scanner,
and just in a few seconds we can get a full-body data set.
So this is from a virtual autopsy.
And you can see how I'm gradually peeling off.
First you saw the body bag that the body came in,
then I'm peeling off the skin -- you can see the muscles --
and eventually you can see the bone structure of this woman.
Now at this point, I would also like to emphasize
that, with the greatest respect
for the people that I'm now going to show --
I'm going to show you a few cases of virtual autopsies --
so it's with great respect for the people
that have died under violent circumstances
that I'm showing these pictures to you.
In the forensic case --
and this is something
where there have been approximately 400 cases so far,
just in the part of Sweden that I come from,
that have undergone virtual autopsies
in the past four years.
So this will be the typical work-flow situation.
The police will decide --
in the evening, when there's a case coming in --
they will decide, okay, is this a case where we need to do an autopsy.
So in the morning, in between six and seven in the morning,
the body is then transported inside the body bag
to our center
and is being scanned through one of the CT scanners.
And then the radiologist, together with the pathologist
and sometimes the forensic scientist,
looks at the data that's coming out,
and they have a joint session.
And then they decide what to do in the real physical autopsy after that.
Now looking at a few cases,
here's one of the first cases that we had.
You can really see the details of the data set;
it's very high-resolution.
And it's our algorithms that allow us
to zoom in on all the details.
And again, it's fully interactive,
so you can rotate and you can look at things in real time
on these systems here.
Without saying too much about this case,
this is a traffic accident,
a drunk driver hit a woman.
And it's very, very easy to see the damages on the bone structure.
And the cause of death is the broken neck.
And this woman also ended up under the car,
so she's quite badly beaten up
by this injury.
Here's another case, a knifing.
And this is also again showing us what we can do.
It's very easy to look at metal artifacts
that we can show inside of the body.
You can also see some of the artifacts from the teeth --
that's actually the fillings in the teeth --
because I've set the functions to show me metal
and make everything else transparent.
Here's another violent case. This really didn't kill the person.
The person was killed by stabs in the heart,
but they just deposited the knife
by putting it through one of the eyeballs.
Here's another case.
It's very interesting for us
to be able to look at things like knife stabbings.
Here you can see that knife went through the heart.
It's very easy to see how air has been leaking
from one part to another part,
which is difficult to do in a normal, standard, physical autopsy.
So it really, really helps
the criminal investigation
to establish the cause of death,
and in some cases also directing the investigation in the right direction
to find out who the killer really was.
Here's another case that I think is interesting.
Here you can see a bullet
that has lodged just next to the spine on this person.
And what we've done is that we've turned the bullet into a light source,
so that bullet is actually shining,
and it makes it really easy to find these fragments.
During a physical autopsy,
if you actually have to dig through the body to find these fragments,
that's actually quite hard to do.
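Turning the bullet into a light source corresponds to giving very dense (metallic) voxels an emission term in the rendering model, so they add light of their own instead of only absorbing it. A sketch under assumed thresholds and emission strength, not the authors' actual implementation:

    def render_ray_emissive(samples, metal_threshold=0.95, emission=2.0):
        """Front-to-back compositing where voxels above `metal_threshold`
        also emit light, so metal fragments 'shine' through the tissue.
        Threshold and emission strength are illustrative assumptions."""
        color, alpha = 0.0, 0.0
        for density in samples:
            # Simple illustrative mapping: hide low densities, keep dense material.
            a = 0.0 if density < 0.7 else 0.9
            c = 1.0 if density >= 0.7 else 0.0
            if density >= metal_threshold:
                c += emission          # emissive boost makes the fragment glow
            color += (1.0 - alpha) * a * c
            alpha += (1.0 - alpha) * a
            if alpha > 0.99:           # early ray termination
                break
        return min(color, 1.0)         # clamp for display

    # A made-up ray passing through tissue, bone, a metal fragment, and bone again.
    print(render_ray_emissive([0.2, 0.2, 0.8, 0.97, 0.8, 0.2]))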
One of the things that I'm really, really happy
to be able to show you here today
is our virtual autopsy table.
It's a touch device that we have developed
based on these algorithms, using standard graphics GPUs.
It actually looks like this,
just to give you a feeling for what it looks like.
It really just works like a huge iPhone.
So we've implemented
all the gestures you can do on the table,
and you can think of it as an enormous touch interface.
So if you were thinking of buying an iPad,
forget about it; this is what you want instead.
Steve, I hope you're listening to this, all right.
So it's a very nice little device.
So if you have the opportunity, please try it out.
It's really a hands-on experience.
So it's gained some traction, and we're trying to roll this out
and trying to use it for educational purposes,
but also, perhaps in the future,
in a more clinical situation.
There's a YouTube video that you can download and look at,
if you want to convey the information about virtual autopsies
to other people.
Okay, now that we're talking about touch,
let me move on to really touching data.
And this is a bit of science fiction now,
so we're moving into really the future.
This is not really what the medical doctors are using right now,
but I hope they will in the future.
So what you're seeing on the left is a touch device.
It's a little mechanical pen
that has very, very fast step motors inside of the pen.
And so I can generate a force feedback.
So when I virtually touch data,
it will generate touch forces in the pen, so I get a feedback.
So in this particular situation,
it's a scan of a living person.
I have this pen, and I look at the data,
and I move the pen towards the head,
and all of a sudden I feel resistance.
So I can feel the skin.
If I push a little bit harder, I'll go through the skin,
and I can feel the bone structure inside.
If I push even harder, I'll go through the bone structure,
especially close to the ear where the bone is very soft.
And then I can feel the brain inside, and it will feel slushy, like this.
So this is really nice.
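The resistance he describes is typically computed from the local data values: when the virtual tip of the pen moves into a region denser than some threshold, a spring-like reaction force pushes back, and the force grows the harder you push. A one-dimensional sketch with invented constants (a real haptic loop runs at roughly 1 kHz against the full 3D volume):

    def feedback_force(density, threshold=0.4, stiffness=30.0):
        """Spring-like reaction force for one probe position.
        Material denser than `threshold` resists in proportion to how far
        above the threshold the sample is. Constants are illustrative."""
        penetration = max(0.0, density - threshold)
        return stiffness * penetration

    # Moving the virtual pen along a made-up line: air, skin, bone, brain.
    profile = [0.05, 0.10, 0.50, 0.90, 0.90, 0.30, 0.25]
    for step, density in enumerate(profile):
        print(f"step {step}: density={density:.2f} force={feedback_force(density):.1f}")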
And to take that even further, this is a heart.
And this is also due to these fantastic new scanners,
that just in 0.3 seconds,
I can scan the whole heart,
and I can do that with time resolution.
So just looking at this heart,
I can play back a video here.
And this is Karljohan, one of my graduate students
who's been working on this project.
And he's sitting there in front of the haptic device, the force feedback system,
and he's moving his pen towards the heart,
and the heart is now beating in front of him,
so he can see how the heart is beating.
He's taken the pen, and he's moving it towards the heart,
and he's putting it on the heart,
and then he feels the heartbeats from the real living patient.
Then he can examine how the heart is moving.
He can go inside, push inside of the heart,
and really feel how the valves are moving.
And this, I think, is really the future for heart surgeons.
I mean it's probably the wet dream for a heart surgeon
to be able to go inside of the patient's heart
before you actually do surgery,
and do that with high-quality resolution data.
So this is really neat.
Now we're going even further into science fiction.
And we heard a little bit about functional MRI.
Now this is really an interesting project.
MRI is using magnetic fields
and radio frequencies
to scan the brain, or any part of the body.
So what we're really getting out of this
is information of the structure of the brain,
but we can also measure the difference
in magnetic properties of blood that's oxygenated
and blood that's depleted of oxygen.
That means that it's possible
to map out the activity of the brain.
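One common way to turn that oxygenation-dependent (BOLD) signal into an activity map is to compare each voxel's time series with the timing of the task and keep the voxels whose signal follows it. A toy correlation-based sketch on simulated data; this is a generic illustration, not the analysis pipeline used in the talk.

    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated task design: 20 s on / 20 s off blocks, one sample per second.
    task = np.tile(np.r_[np.ones(20), np.zeros(20)], 3)

    # Simulated BOLD time series for a handful of voxels (toy data, not real fMRI).
    signals = rng.normal(0.0, 1.0, size=(5, task.size))
    signals[2] += 2.0 * task                     # voxel 2 follows the task

    # Correlate each voxel with the task regressor and threshold the map.
    corr = np.array([np.corrcoef(s, task)[0, 1] for s in signals])
    active = corr > 0.5                          # arbitrary illustrative threshold
    print(corr.round(2), active)

Real analyses add hemodynamic modeling and proper statistics, but the principle is the same: voxels whose signal tracks the task light up in the visualization.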
So this is something that we've been working on.
And you just saw Motts, the research engineer, there
going into the MRI system,
and he was wearing goggles.
So he could actually see things in the goggles.
So I could present things to him while he's in the scanner.
And this is a little bit freaky,
because what Motts is seeing is actually this.
He's seeing his own brain.
So Motts is doing something here.
And probably he is going like this with his right hand,
because the left side is activated
on the motor cortex.
And then he can see that at the same time.
These visualizations are brand new.
And this is something that we've been researching for a little while.
This is another sequence of Motts' brain.
And here we asked Motts to calculate backwards from 100.
So he's going "100, 97, 94."
And then he's going backwards.
And you can see how the little math processor is working up here in his brain
and is lighting up the whole brain.
Well this is fantastic. We can do this in real time.
We can investigate things. We can tell him to do things.
You can also see that his visual cortex
is activated in the back of the head,
because that's where he's seeing, he's seeing his own brain.
And he's also hearing our instructions
when we tell him to do things.
The signal is really deep inside of the brain as well,
but it's shining through,
because all of the data is inside this volume.
And in just a second here you will see --
Okay, here. Motts, now move your left foot.
So he's going like this.
For 20 seconds he's going like that,
and all of a sudden it lights up, up here.
So we've got motor cortex activation up there.
So this is really, really nice.
And I think this is a great tool.
And connecting also with the previous talk here,
this is something that we could use as a tool
to really understand
how the neurons are working, how the brain is working,
and we can do this with very, very high visual quality
and very fast resolution.
Now we're also having a bit of fun at the center.
So this is a CAT scan -- computer aided tomography.
So this is a lion, Elsa, from the local zoo
in Kolmarden, outside of Norrkoping.
So she came to the center,
and they sedated her
and then put her straight into the scanner.
And then, of course, I get the whole data set from the lion.
And I can do very nice images like this.
I can peel off the layer of the lion.
I can look inside of it.
And we've been experimenting with this.
And I think this is a great application
for the future of this technology.
Because there's very little known about the animal anatomy.
What's known out there for veterinarians is kind of basic information.
We can scan all sorts of things,
all sorts of animals.
The only problem is to fit it into the machine.
So here's a bear.
It was kind of hard to get it in.
And the bear is a cuddly, friendly animal.
And here it is. Here is the nose of the bear.
And you might want to cuddle this one,
until you change the functions and look at this.
So be aware of the bear.
So with that,
I'd like to thank all the people
that have helped me to generate these images.
It's a huge effort that goes into doing this,
gathering the data and developing the algorithms,
writing all the software.
So, some very talented people.
My motto is always, I only hire people that are smarter than I am
and most of these are smarter than I am.
So thank you very much.
(Applause)