|
WEBVTT |
|
|
|
00:00:01.280 --> 00:00:06.759 |
|
so the class today is uh introduction to |
|
|
|
00:00:04.680 --> 00:00:09.480 |
|
natural language processing and I'll be |
|
|
|
00:00:06.759 --> 00:00:11.200 |
|
talking a little bit about you know what |
|
|
|
00:00:09.480 --> 00:00:14.719 |
|
is natural language processing why we're |
|
|
|
00:00:11.200 --> 00:00:16.720 |
|
motivated to do it and also some of the |
|
|
|
00:00:14.719 --> 00:00:18.039 |
|
difficulties that we encounter and I'll |
|
|
|
00:00:16.720 --> 00:00:19.880 |
|
at the end I'll also be talking about |
|
|
|
00:00:18.039 --> 00:00:22.519 |
|
class logistics so you can ask any
|
|
|
00:00:19.880 --> 00:00:25.439 |
|
logistics questions at that
|
|
|
00:00:22.519 --> 00:00:27.720 |
|
time so if we talk about what is NLP |
|
|
|
00:00:25.439 --> 00:00:29.320 |
|
anyway uh does anyone have any opinions |
|
|
|
00:00:27.720 --> 00:00:31.439 |
|
about the definition of what natural |
|
|
|
00:00:29.320 --> 00:00:33.239 |
|
language processing would be oh one other
|
|
|
00:00:31.439 --> 00:00:35.680 |
|
thing I should mention is I am recording |
|
|
|
00:00:33.239 --> 00:00:38.600 |
|
the class uh I put the class on YouTube |
|
|
|
00:00:35.680 --> 00:00:40.520 |
|
uh afterwards I will not take pictures |
|
|
|
00:00:38.600 --> 00:00:41.920 |
|
or video of any of you uh but if you |
|
|
|
00:00:40.520 --> 00:00:44.719 |
|
talk your voice might come in the |
|
|
|
00:00:41.920 --> 00:00:47.440 |
|
background so just uh be aware of that |
|
|
|
00:00:44.719 --> 00:00:49.000 |
|
um usually not it's a directional mic so |
|
|
|
00:00:47.440 --> 00:00:51.559 |
|
I try to repeat the questions after |
|
|
|
00:00:49.000 --> 00:00:54.079 |
|
everybody um but uh for the people who |
|
|
|
00:00:51.559 --> 00:00:57.680 |
|
are uh listening to the
|
|
|
00:00:54.079 --> 00:00:59.320 |
|
recordings um so anyway what is NLP |
|
|
|
00:00:57.680 --> 00:01:03.120 |
|
anyway does anybody have any ideas about |
|
|
|
00:00:59.320 --> 00:01:03.120 |
|
the definition of what NLP might |
|
|
|
00:01:06.119 --> 00:01:09.119 |
|
be |
|
|
|
00:01:15.439 --> 00:01:21.759 |
|
yes okay um so the answer was it
|
|
|
00:01:19.240 --> 00:01:25.759 |
|
helps machines understand language |
|
|
|
00:01:21.759 --> 00:01:27.920 |
|
better uh so to facilitate human-human
|
|
|
00:01:25.759 --> 00:01:31.159 |
|
and human-machine interactions I think
|
|
|
00:01:27.920 --> 00:01:32.759 |
|
that's very good um it's |
|
|
|
00:01:31.159 --> 00:01:36.520 |
|
uh similar to what I have written on my |
|
|
|
00:01:32.759 --> 00:01:38.040 |
|
slide here uh but in addition to
|
|
|
00:01:36.520 --> 00:01:41.280 |
|
natural language understanding there's |
|
|
|
00:01:38.040 --> 00:01:46.000 |
|
one major other segment of NLP uh does |
|
|
|
00:01:41.280 --> 00:01:46.000 |
|
anyone uh have an idea what that might |
|
|
|
00:01:48.719 --> 00:01:53.079 |
|
be we often have a dichotomy between two |
|
|
|
00:01:51.399 --> 00:01:55.240 |
|
major segments natural language |
|
|
|
00:01:53.079 --> 00:01:57.520 |
|
understanding and natural language |
|
|
|
00:01:55.240 --> 00:01:59.439 |
|
generation yeah exactly so I would say
|
|
|
00:01:57.520 --> 00:02:03.119 |
|
that's almost perfect if you had said |
|
|
|
00:01:59.439 --> 00:02:06.640 |
|
understand and generate so very good um |
|
|
|
00:02:03.119 --> 00:02:08.560 |
|
so I say NLP is technology to handle
|
|
|
00:02:06.640 --> 00:02:11.400 |
|
human language usually text using |
|
|
|
00:02:08.560 --> 00:02:13.200 |
|
computers uh to aid human-machine
|
|
|
00:02:11.400 --> 00:02:15.480 |
|
communication and this can include |
|
|
|
00:02:13.200 --> 00:02:17.879 |
|
things like question answering dialogue |
|
|
|
00:02:15.480 --> 00:02:20.840 |
|
or generation of code that can be |
|
|
|
00:02:17.879 --> 00:02:23.239 |
|
executed with uh |
|
|
|
00:02:20.840 --> 00:02:25.080 |
|
computers it can also aid human-human
|
|
|
00:02:23.239 --> 00:02:27.440 |
|
communication and this can include |
|
|
|
00:02:25.080 --> 00:02:30.440 |
|
things like machine translation or spell |
|
|
|
00:02:27.440 --> 00:02:32.640 |
|
checking or assisted writing |
|
|
|
00:02:30.440 --> 00:02:34.560 |
|
and then a final uh segment that people |
|
|
|
00:02:32.640 --> 00:02:37.400 |
|
might think about a little bit less is |
|
|
|
00:02:34.560 --> 00:02:39.400 |
|
analyzing and understanding a language |
|
|
|
00:02:37.400 --> 00:02:42.400 |
|
and this includes things like syntactic |
|
|
|
00:02:39.400 --> 00:02:44.959 |
|
analysis text classification entity |
|
|
|
00:02:42.400 --> 00:02:47.400 |
|
recognition and linking and these can be |
|
|
|
00:02:44.959 --> 00:02:49.159 |
|
used for uh various reasons not |
|
|
|
00:02:47.400 --> 00:02:51.000 |
|
necessarily for direct human machine |
|
|
|
00:02:49.159 --> 00:02:52.720 |
|
communication but also for like |
|
|
|
00:02:51.000 --> 00:02:54.400 |
|
aggregating information across large |
|
|
|
00:02:52.720 --> 00:02:55.760 |
|
things for scientific studies and other |
|
|
|
00:02:54.400 --> 00:02:57.519 |
|
things like that I'll give a few |
|
|
|
00:02:55.760 --> 00:03:00.920 |
|
examples of |
|
|
|
00:02:57.519 --> 00:03:04.040 |
|
this um we now use NLP many times a day
|
|
|
00:03:00.920 --> 00:03:06.480 |
|
sometimes without even knowing it so uh |
|
|
|
00:03:04.040 --> 00:03:09.400 |
|
whenever you're typing a doc in Google |
|
|
|
00:03:06.480 --> 00:03:11.599 |
|
Docs there's you know spell checking and |
|
|
|
00:03:09.400 --> 00:03:13.959 |
|
grammar checking going on behind the scenes it's
|
|
|
00:03:11.599 --> 00:03:15.920 |
|
gotten frighteningly good
|
|
|
00:03:13.959 --> 00:03:18.280 |
|
recently where it catches like most
|
|
|
00:03:15.920 --> 00:03:20.720 |
|
of my mistakes and rarely flags things
|
|
|
00:03:18.280 --> 00:03:22.799 |
|
that are not mistakes so obviously they |
|
|
|
00:03:20.720 --> 00:03:25.080 |
|
have powerful models running behind that |
|
|
|
00:03:22.799 --> 00:03:25.080 |
|
uh |
|
|
|
00:03:25.640 --> 00:03:33.080 |
|
so and it can do things like answer |
|
|
|
00:03:28.720 --> 00:03:34.599 |
|
questions uh so I asked ChatGPT who is
|
|
|
00:03:33.080 --> 00:03:37.000 |
|
the current president of Carnegie Mellon
|
|
|
00:03:34.599 --> 00:03:38.920 |
|
University and ChatGPT said I did a
|
|
|
00:03:37.000 --> 00:03:40.920 |
|
quick search for more information here |
|
|
|
00:03:38.920 --> 00:03:43.439 |
|
is what I found uh the current president |
|
|
|
00:03:40.920 --> 00:03:47.120 |
|
of Carnegie Mellon University is Farnam Jahanian he
|
|
|
00:03:43.439 --> 00:03:50.040 |
|
has been serving since July 1 etc etc so |
|
|
|
00:03:47.120 --> 00:03:50.040 |
|
as far as I can tell that's |
|
|
|
00:03:50.400 --> 00:03:56.319 |
|
correct um at the same time I asked how |
|
|
|
00:03:53.799 --> 00:04:00.280 |
|
many layers are included in the GPT-3.5
|
|
|
00:03:56.319 --> 00:04:02.360 |
|
turbo architecture and it said to me |
|
|
|
00:04:00.280 --> 00:04:05.400 |
|
GPT-3.5 Turbo which is an optimized
|
|
|
00:04:02.360 --> 00:04:07.239 |
|
version of GPT-3.5 for faster responses
|
|
|
00:04:05.400 --> 00:04:08.959 |
|
doesn't have a specific layered
|
|
|
00:04:07.239 --> 00:04:11.720 |
|
structure like the traditional GPT-3
|
|
|
00:04:08.959 --> 00:04:13.560 |
|
models um and I don't know if this is |
|
|
|
00:04:11.720 --> 00:04:16.600 |
|
true or not but I'm pretty sure it's not |
|
|
|
00:04:13.560 --> 00:04:18.840 |
|
true I'm pretty sure that you know GPT |
|
|
|
00:04:16.600 --> 00:04:20.560 |
|
is a model that's much like other models |
|
|
|
00:04:18.840 --> 00:04:21.560 |
|
uh so it basically just made up the spec |
|
|
|
00:04:20.560 --> 00:04:22.880 |
|
because it didn't have any information |
|
|
|
00:04:21.560 --> 00:04:26.000 |
|
on the Internet or couldn't talk about |
|
|
|
00:04:22.880 --> 00:04:26.000 |
|
it so |
|
|
|
00:04:26.120 --> 00:04:33.479 |
|
um another thing is uh NLP can translate |
|
|
|
00:04:29.639 --> 00:04:37.759 |
|
text pretty well so I ran um Google |
|
|
|
00:04:33.479 --> 00:04:39.560 |
|
Translate uh on Japanese uh this example
|
|
|
00:04:37.759 --> 00:04:41.639 |
|
is a little bit old it's from uh you |
|
|
|
00:04:39.560 --> 00:04:44.639 |
|
know a few years ago about COVID but I
|
|
|
00:04:41.639 --> 00:04:46.240 |
|
retranslated it a few days ago and it |
|
|
|
00:04:44.639 --> 00:04:47.680 |
|
comes up pretty good uh you can |
|
|
|
00:04:46.240 --> 00:04:49.639 |
|
basically understand what's going on |
|
|
|
00:04:47.680 --> 00:04:53.520 |
|
here it's not perfect but you can |
|
|
|
00:04:49.639 --> 00:04:56.400 |
|
understand the uh the general uh |
|
|
|
00:04:53.520 --> 00:04:58.560 |
|
gist at the same time uh if I put in a |
|
|
|
00:04:56.400 --> 00:05:02.280 |
|
relatively low resource language this is |
|
|
|
00:04:58.560 --> 00:05:05.759 |
|
Kurdish um it has a number of problems |
|
|
|
00:05:02.280 --> 00:05:08.199 |
|
when you try to understand it and just |
|
|
|
00:05:05.759 --> 00:05:12.400 |
|
to give an example this is talking about |
|
|
|
00:05:08.199 --> 00:05:14.320 |
|
uh some uh paleontology Discovery it |
|
|
|
00:05:12.400 --> 00:05:15.800 |
|
called this person a fossil scientist |
|
|
|
00:05:14.320 --> 00:05:17.440 |
|
instead of the kind of obvious English |
|
|
|
00:05:15.800 --> 00:05:20.120 |
|
term |
|
|
|
00:05:17.440 --> 00:05:23.520 |
|
paleontologist um and it's talking about |
|
|
|
00:05:20.120 --> 00:05:25.240 |
|
three different uh T-Rex species uh how |
|
|
|
00:05:23.520 --> 00:05:27.039 |
|
T-Rex should actually be split into |
|
|
|
00:05:25.240 --> 00:05:29.639 |
|
three species where T-Rex says king of |
|
|
|
00:05:27.039 --> 00:05:31.560 |
|
ferocious lizards T. imperator says emperor
|
|
|
00:05:29.639 --> 00:05:33.720 |
|
of savage lizards and then T. regina
|
|
|
00:05:31.560 --> 00:05:35.120 |
|
means queen of ferocious snail I'm
|
|
|
00:05:33.720 --> 00:05:37.240 |
|
pretty sure that's not snail I'm pretty |
|
|
|
00:05:35.120 --> 00:05:41.080 |
|
sure that's lizard so uh you can see |
|
|
|
00:05:37.240 --> 00:05:41.080 |
|
that this is not uh this is not perfect |
|
|
|
00:05:41.280 --> 00:05:46.680 |
|
either some people might be thinking why |
|
|
|
00:05:43.960 --> 00:05:48.400 |
|
Google Translate and why not GPT well it
|
|
|
00:05:46.680 --> 00:05:49.960 |
|
turns out um according to one of the |
|
|
|
00:05:48.400 --> 00:05:51.759 |
|
recent studies we've done GPT is even
|
|
|
00:05:49.960 --> 00:05:55.479 |
|
worse at these low resource languages
|
|
|
00:05:51.759 --> 00:05:58.120 |
|
so I use the best thing that's out |
|
|
|
00:05:55.479 --> 00:06:00.440 |
|
there um another thing is language |
|
|
|
00:05:58.120 --> 00:06:02.039 |
|
analysis can aid scientific inquiry
|
|
|
00:06:00.440 --> 00:06:03.600 |
|
so this is an example that I've been |
|
|
|
00:06:02.039 --> 00:06:06.120 |
|
using for a long time it's actually from |
|
|
|
00:06:03.600 --> 00:06:09.160 |
|
Maarten Sap another faculty member here
|
|
|
00:06:06.120 --> 00:06:12.440 |
|
uh but I have been using it since uh |
|
|
|
00:06:09.160 --> 00:06:14.160 |
|
like before he joined and it uh this is |
|
|
|
00:06:12.440 --> 00:06:16.039 |
|
an example from computational social |
|
|
|
00:06:14.160 --> 00:06:18.599 |
|
science uh answering questions about |
|
|
|
00:06:16.039 --> 00:06:20.240 |
|
Society given observational data and |
|
|
|
00:06:18.599 --> 00:06:22.280 |
|
their question was do movie scripts |
|
|
|
00:06:20.240 --> 00:06:24.599 |
|
portray female or male characters with |
|
|
|
00:06:22.280 --> 00:06:27.520 |
|
more power or agency in
|
|
|
00:06:24.599 --> 00:06:30.120 |
|
films so it's asking kind of a
|
|
|
00:06:27.520 --> 00:06:32.160 |
|
societal question by using NLP |
|
|
|
00:06:30.120 --> 00:06:35.360 |
|
technology and the way they did it is |
|
|
|
00:06:32.160 --> 00:06:36.880 |
|
they basically analyzed text trying to |
|
|
|
00:06:35.360 --> 00:06:43.080 |
|
find |
|
|
|
00:06:36.880 --> 00:06:45.280 |
|
uh the uh agents and patients in a
|
|
|
00:06:43.080 --> 00:06:46.479 |
|
particular text which are the the things |
|
|
|
00:06:45.280 --> 00:06:49.280 |
|
that are doing things and the things |
|
|
|
00:06:46.479 --> 00:06:52.639 |
|
that things are being done to and you |
|
|
|
00:06:49.280 --> 00:06:54.440 |
|
can see that essentially male characters |
|
|
|
00:06:52.639 --> 00:06:56.560 |
|
in these movie scripts were given more |
|
|
|
00:06:54.440 --> 00:06:58.080 |
|
power and agency and female characters
|
|
|
00:06:56.560 --> 00:06:59.960 |
|
were given less power and agency and they
|
|
|
00:06:58.080 --> 00:07:02.680 |
|
were able to do this because they had |
|
|
|
00:06:59.960 --> 00:07:04.840 |
|
NLP technology that analyzed and |
|
|
|
00:07:02.680 --> 00:07:08.960 |
|
extracted useful data and turned it
|
|
|
00:07:04.840 --> 00:07:11.520 |
|
into a very easy form to do kind of |
|
|
|
00:07:08.960 --> 00:07:15.840 |
|
analysis of the variety that they want |
|
|
|
00:07:11.520 --> 00:07:17.400 |
|
so um I think that's a major use case of |
|
|
|
00:07:15.840 --> 00:07:19.400 |
|
NLP technology that does language |
|
|
|
00:07:17.400 --> 00:07:20.919 |
|
analysis nowadays turning it into a form
|
|
|
00:07:19.400 --> 00:07:23.960 |
|
that allows you to very quickly do |
|
|
|
00:07:20.919 --> 00:07:27.440 |
|
aggregate queries and other things like |
|
|
|
00:07:23.960 --> 00:07:30.479 |
|
this um but at the same time uh language |
|
|
|
00:07:27.440 --> 00:07:33.520 |
|
analysis tools fail at very basic tasks |
|
|
|
00:07:30.479 --> 00:07:36.000 |
|
so these are |
|
|
|
00:07:33.520 --> 00:07:38.199 |
|
some things that I ran through a named |
|
|
|
00:07:36.000 --> 00:07:41.080 |
|
entity recognizer and these were kind of |
|
|
|
00:07:38.199 --> 00:07:43.160 |
|
very nice named entity recognizers uh |
|
|
|
00:07:41.080 --> 00:07:46.240 |
|
that a lot of people were using for |
|
|
|
00:07:43.160 --> 00:07:48.039 |
|
example Stanford CoreNLP and spaCy and
|
|
|
00:07:46.240 --> 00:07:50.319 |
|
both of them I just threw in the first |
|
|
|
00:07:48.039 --> 00:07:53.120 |
|
thing that I found on the New York Times |
|
|
|
00:07:50.319 --> 00:07:55.199 |
|
at the time and it basically made at |
|
|
|
00:07:53.120 --> 00:07:58.319 |
|
least one mistake in the first sentence |
|
|
|
00:07:55.199 --> 00:08:00.840 |
|
and here it recognizes Baton Rouge as an |
|
|
|
00:07:58.319 --> 00:08:04.720 |
|
organization and here it recognized |
|
|
|
00:08:00.840 --> 00:08:07.000 |
|
Hurricane Ida as an organization so um
|
|
|
00:08:04.720 --> 00:08:08.879 |
|
like even uh these things that we expect |
|
|
|
00:08:07.000 --> 00:08:10.360 |
|
should work pretty well make pretty |
|
|
|
00:08:08.879 --> 00:08:13.360 |
|
silly
|
|
|
00:08:10.360 --> 00:08:16.199 |
|
mistakes
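[Illustration — a hedged sketch of running this kind of off-the-shelf recognizer with spaCy; the example sentence is invented and the small English model is assumed to be installed:]

```python
# Hedged sketch of off-the-shelf NER with spaCy (one of the tools mentioned);
# assumes `pip install spacy` and `python -m spacy download en_core_web_sm`.
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Hurricane Ida caused heavy flooding in Baton Rouge.")
for ent in doc.ents:
    print(ent.text, ent.label_)  # eyeball whether the labels are sensible
```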
|
|
|
00:08:13.360 --> 00:08:18.479 |
|
so in the class uh basically what I want to cover is uh what goes
|
|
|
00:08:16.199 --> 00:08:20.360 |
|
into building uh state-of-the-art NLP |
|
|
|
00:08:18.479 --> 00:08:24.000 |
|
systems that work really well on a wide |
|
|
|
00:08:20.360 --> 00:08:26.240 |
|
variety of tasks um where do current |
|
|
|
00:08:24.000 --> 00:08:28.840 |
|
systems |
|
|
|
00:08:26.240 --> 00:08:30.479 |
|
fail and how can we make appropriate |
|
|
|
00:08:28.840 --> 00:08:35.000 |
|
improvements and achieve whatever we
|
|
|
00:08:30.479 --> 00:08:37.719 |
|
want to do with NLP and this set of
|
|
|
00:08:35.000 --> 00:08:39.360 |
|
questions that I'm asking here is |
|
|
|
00:08:37.719 --> 00:08:40.919 |
|
exactly the same as the set of questions |
|
|
|
00:08:39.360 --> 00:08:43.519 |
|
that I was asking two years ago before |
|
|
|
00:08:40.919 --> 00:08:45.480 |
|
ChatGPT uh I still think they're
|
|
|
00:08:43.519 --> 00:08:46.920 |
|
important questions but I think the |
|
|
|
00:08:45.480 --> 00:08:48.399 |
|
answers to these questions are very
|
|
|
00:08:46.920 --> 00:08:50.040 |
|
different and because of that we're |
|
|
|
00:08:48.399 --> 00:08:52.120 |
|
updating the class materials to try to |
|
|
|
00:08:50.040 --> 00:08:54.399 |
|
cover you know the answers to these |
|
|
|
00:08:52.120 --> 00:08:56.000 |
|
questions and uh in kind of the era of |
|
|
|
00:08:54.399 --> 00:08:58.200 |
|
large language models and other things |
|
|
|
00:08:56.000 --> 00:08:59.720 |
|
like |
|
|
|
00:08:58.200 --> 00:09:02.079 |
|
that |
|
|
|
00:08:59.720 --> 00:09:03.360 |
|
so that's all I have for the intro
|
|
|
00:09:02.079 --> 00:09:06.640 |
|
maybe pretty straightforward are there |
|
|
|
00:09:03.360 --> 00:09:08.480 |
|
any questions or comments so far if not |
|
|
|
00:09:06.640 --> 00:09:14.399 |
|
I'll just go
|
|
|
00:09:08.480 --> 00:09:17.160 |
|
on okay great so I want to uh first go |
|
|
|
00:09:14.399 --> 00:09:19.480 |
|
into a very high-level overview of NLP
|
|
|
00:09:17.160 --> 00:09:20.839 |
|
system building and most of the stuff |
|
|
|
00:09:19.480 --> 00:09:22.399 |
|
that I want to do today is to set the |
|
|
|
00:09:20.839 --> 00:09:24.320 |
|
stage for what I'm going to be talking |
|
|
|
00:09:22.399 --> 00:09:25.040 |
|
about in more detail uh over the rest of |
|
|
|
00:09:24.320 --> 00:09:29.200 |
|
the |
|
|
|
00:09:25.040 --> 00:09:31.720 |
|
class and we could think of NLP
|
|
|
00:09:29.200 --> 00:09:34.040 |
|
systems through this kind of general
|
|
|
00:09:31.720 --> 00:09:36.560 |
|
framework where we want to create a |
|
|
|
00:09:34.040 --> 00:09:40.600 |
|
function to map an input X into an |
|
|
|
00:09:36.560 --> 00:09:44.440 |
|
output y uh where X and or Y involve |
|
|
|
00:09:40.600 --> 00:09:47.000 |
|
language
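[Illustration — a minimal sketch of this framing with invented names: an NLP system is just a function f(x) -> y where x and/or y involve language:]

```python
# Minimal sketch of the framework above: an NLP system as a function
# mapping an input x to an output y. Names here are illustrative only.
from typing import Callable

NLPSystem = Callable[[str], str]  # e.g. text in -> text out

def shout(x: str) -> str:
    # A trivial stand-in "system" mapping input text to output text.
    return x.upper() + "!"

f: NLPSystem = shout
print(f("hello world"))  # -> HELLO WORLD!
```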
|
|
|
00:09:44.440 --> 00:09:50.120 |
|
and uh do some people have favorite NLP tasks or NLP tasks that you
|
|
|
00:09:47.000 --> 00:09:52.399 |
|
uh want to be handling in some
|
|
|
00:09:50.120 --> 00:09:57.000 |
|
way or maybe what do you think are
|
|
|
00:09:52.399 --> 00:09:57.000 |
|
the most popular and important NLP tasks |
|
|
|
00:09:58.120 --> 00:10:03.200 |
|
nowadays |
|
|
|
00:10:00.800 --> 00:10:06.120 |
|
okay so translation is maybe easy what's |
|
|
|
00:10:03.200 --> 00:10:06.120 |
|
the input and output of |
|
|
|
00:10:11.440 --> 00:10:15.720 |
|
translation okay yeah so uh in |
|
|
|
00:10:13.800 --> 00:10:17.959 |
|
translation the input is text in one language
|
|
|
00:10:15.720 --> 00:10:21.760 |
|
output is text in another language and |
|
|
|
00:10:17.959 --> 00:10:21.760 |
|
then what is a good
|
|
|
00:10:27.680 --> 00:10:32.160 |
|
translation yeah correct or the same as
|
|
|
00:10:30.320 --> 00:10:35.839 |
|
the input basically yes um it also |
|
|
|
00:10:32.160 --> 00:10:37.760 |
|
should be fluent but I agree any other |
|
|
|
00:10:35.839 --> 00:10:39.839 |
|
things generation the reason why I said |
|
|
|
00:10:37.760 --> 00:10:41.519 |
|
it's tough is it's pretty broad um and |
|
|
|
00:10:39.839 --> 00:10:43.360 |
|
it's like we could be doing
|
|
|
00:10:41.519 --> 00:10:46.360 |
|
generation with lots of different inputs |
|
|
|
00:10:43.360 --> 00:10:51.440 |
|
but um yeah any other things maybe a
|
|
|
00:10:46.360 --> 00:10:51.440 |
|
little bit different yeah like |
|
|
|
00:10:51.480 --> 00:10:55.959 |
|
scenario a scenario and a multiple |
|
|
|
00:10:54.000 --> 00:10:58.200 |
|
choice question about the scenario and |
|
|
|
00:10:55.959 --> 00:10:59.680 |
|
so the scenario and the
|
|
|
00:10:58.200 --> 00:11:01.760 |
|
multiple choice question are probably |
|
|
|
00:10:59.680 --> 00:11:04.040 |
|
the input and then the output |
|
|
|
00:11:01.760 --> 00:11:06.480 |
|
is an answer to the multiple choice |
|
|
|
00:11:04.040 --> 00:11:07.920 |
|
question um and then it's kind of
|
|
|
00:11:06.480 --> 00:11:12.279 |
|
obvious like what is good it's the |
|
|
|
00:11:07.920 --> 00:11:14.880 |
|
correct answer sure um interestingly I |
|
|
|
00:11:12.279 --> 00:11:17.440 |
|
think a lot of LLM evaluation is done on
|
|
|
00:11:14.880 --> 00:11:21.160 |
|
these multiple choice questions but I'm |
|
|
|
00:11:17.440 --> 00:11:22.320 |
|
yet to encounter an actual application |
|
|
|
00:11:21.160 --> 00:11:24.880 |
|
that cares about multiple choice |
|
|
|
00:11:22.320 --> 00:11:26.880 |
|
question answering so uh there's kind of |
|
|
|
00:11:24.880 --> 00:11:30.959 |
|
a funny disconnect there but uh yeah I |
|
|
|
00:11:26.880 --> 00:11:33.519 |
|
saw a hand uh think about vector search
|
|
|
00:11:30.959 --> 00:11:36.360 |
|
yeah vector search uh that's very good
|
|
|
00:11:33.519 --> 00:11:36.360 |
|
so the input |
|
|
|
00:11:37.120 --> 00:11:45.000 |
|
[inaudible]
|
|
|
00:11:42.560 --> 00:11:45.000 |
|
[inaudible]
|
|
|
00:11:47.360 --> 00:11:53.760 |
|
another okay yeah so I'd say the input |
|
|
|
00:11:49.880 --> 00:11:56.160 |
|
there is a query and a document base um |
|
|
|
00:11:53.760 --> 00:11:57.959 |
|
and then the output is maybe an index |
|
|
|
00:11:56.160 --> 00:11:59.800 |
|
into the document or or something else |
|
|
|
00:11:57.959 --> 00:12:01.279 |
|
like that sure um and then something |
|
|
|
00:11:59.800 --> 00:12:05.040 |
|
that's good here's a good question
|
|
|
00:12:01.279 --> 00:12:05.040 |
|
what's a good result from
|
|
|
00:12:06.560 --> 00:12:10.200 |
|
that what's a good |
|
|
|
00:12:10.839 --> 00:12:19.279 |
|
output be sort of similar the major
|
|
|
00:12:15.560 --> 00:12:21.680 |
|
problem there I see is how you define similar
|
|
|
00:12:19.279 --> 00:12:26.199 |
|
and how you |
|
|
|
00:12:21.680 --> 00:12:29.760 |
|
always like you understand
|
|
|
00:12:26.199 --> 00:12:33.000 |
|
whether it is actually
|
|
|
00:12:29.760 --> 00:12:35.079 |
|
yeah exactly so that um just to repeat |
|
|
|
00:12:33.000 --> 00:12:36.880 |
|
it's like uh we need to have a |
|
|
|
00:12:35.079 --> 00:12:38.399 |
|
similarity a good similarity metric we |
|
|
|
00:12:36.880 --> 00:12:40.120 |
|
need to have a good threshold where we |
|
|
|
00:12:38.399 --> 00:12:41.760 |
|
get like the ones we want and we don't |
|
|
|
00:12:40.120 --> 00:12:43.240 |
|
get the ones we don't want we're going |
|
|
|
00:12:41.760 --> 00:12:44.959 |
|
to talk more about that in the retrieval |
|
|
|
00:12:43.240 --> 00:12:48.440 |
|
lecture exactly how we evaluate and |
|
|
|
00:12:44.959 --> 00:12:49.920 |
|
stuff but um yeah good so this is a good |
|
|
|
00:12:48.440 --> 00:12:53.279 |
|
uh here are some good examples I have |
|
|
|
00:12:49.920 --> 00:12:55.519 |
|
some examples of my own um the first one |
|
|
|
00:12:53.279 --> 00:12:58.360 |
|
is uh kind of the very generic one maybe |
|
|
|
00:12:55.519 --> 00:13:00.800 |
|
kind of like generation here but text in |
|
|
|
00:12:58.360 --> 00:13:02.959 |
|
continuing text uh so this is language |
|
|
|
00:13:00.800 --> 00:13:04.160 |
|
modeling so you have a text and then you |
|
|
|
00:13:02.959 --> 00:13:05.440 |
|
have the continuation you want to |
|
|
|
00:13:04.160 --> 00:13:07.680 |
|
predict the |
|
|
|
00:13:05.440 --> 00:13:10.480 |
|
continuation um text and text in another |
|
|
|
00:13:07.680 --> 00:13:13.040 |
|
language is translation uh text and a
|
|
|
00:13:10.480 --> 00:13:15.800 |
|
label could be text classification uh |
|
|
|
00:13:13.040 --> 00:13:17.760 |
|
text and linguistic structure or uh some
|
|
|
00:13:15.800 --> 00:13:21.360 |
|
some kind of entities or something like
|
|
|
00:13:17.760 --> 00:13:22.680 |
|
that could be uh language analysis or um |
|
|
|
00:13:21.360 --> 00:13:24.839 |
|
information |
|
|
|
00:13:22.680 --> 00:13:29.440 |
|
extraction uh we could also have image |
|
|
|
00:13:24.839 --> 00:13:31.320 |
|
and text uh which is image captioning um |
|
|
|
00:13:29.440 --> 00:13:33.560 |
|
or speech and text which is speech |
|
|
|
00:13:31.320 --> 00:13:35.240 |
|
recognition
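[Illustration — the input/output pairs just listed, written out in code; a paraphrase of the examples, not verbatim from the slide:]

```python
# Tasks as (input, output) pairs, matching the examples just given.
TASKS = {
    "language modeling":   ("text", "continuing text"),
    "translation":         ("text", "text in another language"),
    "text classification": ("text", "label"),
    "language analysis":   ("text", "linguistic structure / entities"),
    "image captioning":    ("image", "text"),
    "speech recognition":  ("speech", "text"),
}
```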
|
|
|
00:13:33.560 --> 00:13:38.000 |
|
and I take the very broad view of natural language processing
|
|
|
00:13:35.240 --> 00:13:39.519 |
|
which is if it's any variety of language |
|
|
|
00:13:38.000 --> 00:13:41.519 |
|
uh if you're handling language in some |
|
|
|
00:13:39.519 --> 00:13:42.800 |
|
way it's natural language processing it |
|
|
|
00:13:41.519 --> 00:13:45.880 |
|
doesn't necessarily have to be text |
|
|
|
00:13:42.800 --> 00:13:47.480 |
|
input text output um so that's relevant |
|
|
|
00:13:45.880 --> 00:13:50.199 |
|
for the projects that you're thinking |
|
|
|
00:13:47.480 --> 00:13:52.160 |
|
about too at the end of this course so |
|
|
|
00:13:50.199 --> 00:13:55.519 |
|
the the most common FAQ for this course |
|
|
|
00:13:52.160 --> 00:13:57.839 |
|
is does my project count and if you're |
|
|
|
00:13:55.519 --> 00:13:59.360 |
|
uncertain you should ask but usually |
|
|
|
00:13:57.839 --> 00:14:01.040 |
|
like if it has some sort of language |
|
|
|
00:13:59.360 --> 00:14:05.079 |
|
involved then I'll usually say yes it |
|
|
|
00:14:01.040 --> 00:14:07.920 |
|
does count so um if it's like uh code to
|
|
|
00:14:05.079 --> 00:14:09.680 |
|
code uh code is not
|
|
|
00:14:07.920 --> 00:14:11.480 |
|
natural language it is language but it's |
|
|
|
00:14:09.680 --> 00:14:13.000 |
|
not natural language so that might be |
|
|
|
00:14:11.480 --> 00:14:15.320 |
|
borderline we might have to discuss |
|
|
|
00:14:13.000 --> 00:14:15.320 |
|
about |
|
|
|
00:14:15.759 --> 00:14:21.800 |
|
that cool um so next I'd like to talk |
|
|
|
00:14:18.880 --> 00:14:25.240 |
|
about methods for creating NLP systems |
|
|
|
00:14:21.800 --> 00:14:27.839 |
|
um and there's a lot of different ways |
|
|
|
00:14:25.240 --> 00:14:29.720 |
|
to create NLP systems all of these are
|
|
|
00:14:27.839 --> 00:14:32.880 |
|
alive and well in |
|
|
|
00:14:29.720 --> 00:14:35.759 |
|
2024 uh the first one is uh
|
|
|
00:14:32.880 --> 00:14:37.959 |
|
rule-based system creation and so the |
|
|
|
00:14:35.759 --> 00:14:40.399 |
|
way this works is like let's say you |
|
|
|
00:14:37.959 --> 00:14:42.480 |
|
want to build a text classifier you just |
|
|
|
00:14:40.399 --> 00:14:46.560 |
|
write the simple python function that |
|
|
|
00:14:42.480 --> 00:14:48.639 |
|
classifies things into uh sports or |
|
|
|
00:14:46.560 --> 00:14:50.240 |
|
other and the way it classifies it into |
|
|
|
00:14:48.639 --> 00:14:52.959 |
|
sports or other is it checks whether |
|
|
|
00:14:50.240 --> 00:14:55.160 |
|
baseball soccer football and tennis are
|
|
|
00:14:52.959 --> 00:14:59.399 |
|
included in the document and classifies |
|
|
|
00:14:55.160 --> 00:15:01.959 |
|
it into uh sports if so uh other if not
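[Illustration — a minimal sketch of the rule-based classifier just described; illustrative code, not the exact classroom file:]

```python
# Rule-based text classification: label a document "sports" if any of the
# listed sports words is included, and "other" if not.
SPORTS_WORDS = {"baseball", "soccer", "football", "tennis"}

def classify(document: str) -> str:
    words = set(document.lower().split())
    return "sports" if words & SPORTS_WORDS else "other"

print(classify("I love to play baseball"))      # -> sports
print(classify("the stock price is going up"))  # -> other
```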
|
|
|
00:14:59.399 --> 00:15:05.279 |
|
so has anyone written something like |
|
|
|
00:15:01.959 --> 00:15:09.720 |
|
this maybe not a text classifier but um |
|
|
|
00:15:05.279 --> 00:15:11.880 |
|
you know to identify entities or uh |
|
|
|
00:15:09.720 --> 00:15:14.279 |
|
split words |
|
|
|
00:15:11.880 --> 00:15:16.680 |
|
or something like |
|
|
|
00:15:14.279 --> 00:15:18.399 |
|
that has anybody not ever written |
|
|
|
00:15:16.680 --> 00:15:22.800 |
|
anything like |
|
|
|
00:15:18.399 --> 00:15:24.639 |
|
this yeah that's what I thought so um |
|
|
|
00:15:22.800 --> 00:15:26.079 |
|
rule-based systems are very convenient |
|
|
|
00:15:24.639 --> 00:15:28.920 |
|
when you don't really care about how |
|
|
|
00:15:26.079 --> 00:15:30.759 |
|
good your system is um or you're doing something
|
|
|
00:15:28.920 --> 00:15:32.360 |
|
that's really really simple and like |
|
|
|
00:15:30.759 --> 00:15:35.600 |
|
it'll be perfect even if you do the very |
|
|
|
00:15:32.360 --> 00:15:37.079 |
|
simple thing and so I I think it's worth |
|
|
|
00:15:35.600 --> 00:15:39.959 |
|
talking a little bit about them and I'll |
|
|
|
00:15:37.079 --> 00:15:43.319 |
|
talk a little bit about that uh this |
|
|
|
00:15:39.959 --> 00:15:45.680 |
|
time the second thing which like very |
|
|
|
00:15:43.319 --> 00:15:47.680 |
|
rapidly over the course of maybe three |
|
|
|
00:15:45.680 --> 00:15:50.279 |
|
years or so has become actually maybe |
|
|
|
00:15:47.680 --> 00:15:52.720 |
|
the dominant paradigm in NLP is
|
|
|
00:15:50.279 --> 00:15:56.360 |
|
prompting uh in prompting a language |
|
|
|
00:15:52.720 --> 00:15:58.560 |
|
model and the way this works is uh you |
|
|
|
00:15:56.360 --> 00:16:00.720 |
|
ask a language model if the following |
|
|
|
00:15:58.560 --> 00:16:03.079 |
|
sentence is about sports reply sports
|
|
|
00:16:00.720 --> 00:16:06.120 |
|
otherwise reply other and you feed it to |
|
|
|
00:16:03.079 --> 00:16:08.480 |
|
your favorite LM uh usually that's GPT |
|
|
|
00:16:06.120 --> 00:16:11.399 |
|
something or other uh sometimes it's an |
|
|
|
00:16:08.480 --> 00:16:14.440 |
|
open source model of some variety and |
|
|
|
00:16:11.399 --> 00:16:17.759 |
|
then uh it will give you the |
|
|
|
00:16:14.440 --> 00:16:20.639 |
|
answer
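[Illustration — a hedged sketch of the prompting approach, assuming an OpenAI-style chat-completions client; the client setup and model name are assumptions, not from the lecture:]

```python
# Prompting a language model to act as the sports/other classifier.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
prompt = ("If the following sentence is about sports, reply 'sports'; "
          "otherwise reply 'other'.\nSentence: I love to play baseball")
response = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical model choice
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```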
|
|
|
00:16:17.759 --> 00:16:22.240 |
|
and then finally uh fine-tuning uh so you take some paired data and you
|
|
|
00:16:20.639 --> 00:16:23.600 |
|
do machine learning from paired data |
|
|
|
00:16:22.240 --> 00:16:25.680 |
|
where you have something like I love to |
|
|
|
00:16:23.600 --> 00:16:27.440 |
|
play baseball uh the stock price is |
|
|
|
00:16:25.680 --> 00:16:29.519 |
|
going up he got a hat trick yesterday he
|
|
|
00:16:27.440 --> 00:16:32.759 |
|
is wearing tennis shoes and you assign |
|
|
|
00:16:29.519 --> 00:16:35.319 |
|
all these uh labels to them training a |
|
|
|
00:16:32.759 --> 00:16:38.160 |
|
model and you can even start out with a |
|
|
|
00:16:35.319 --> 00:16:41.480 |
|
prompting-based model and fine-tune a
|
|
|
00:16:38.160 --> 00:16:41.480 |
|
language model also
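[Illustration — a hedged sketch of learning from paired data; a scikit-learn bag-of-words logistic regression stands in for full language-model fine-tuning, just to make the setup concrete:]

```python
# Training a classifier from (text, label) pairs like the ones above.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["I love to play baseball", "the stock price is going up",
         "he got a hat trick yesterday", "he is wearing tennis shoes"]
labels = ["sports", "other", "sports", "other"]

model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(texts, labels)  # learn weights from the paired data
print(model.predict(["she scored a goal in the match"]))
```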
|
|
|
00:16:42.920 --> 00:16:49.399 |
|
so one major consideration when
|
|
|
00:16:47.519 --> 00:16:52.000 |
|
you're building systems like this is the
|
|
|
00:16:49.399 --> 00:16:56.440 |
|
data requirements for building such a |
|
|
|
00:16:52.000 --> 00:16:59.319 |
|
system and for rules or prompting where |
|
|
|
00:16:56.440 --> 00:17:02.240 |
|
it's just based on intuition really no |
|
|
|
00:16:59.319 --> 00:17:04.640 |
|
data is needed whatsoever you don't
|
|
|
00:17:02.240 --> 00:17:08.240 |
|
need a single example and you can start |
|
|
|
00:17:04.640 --> 00:17:11.000 |
|
writing rules like just to give
|
|
|
00:17:08.240 --> 00:17:12.640 |
|
an example the rules and prompts I wrote |
|
|
|
00:17:11.000 --> 00:17:14.679 |
|
here I didn't look at any examples and I |
|
|
|
00:17:12.640 --> 00:17:17.240 |
|
just wrote them uh so this is something |
|
|
|
00:17:14.679 --> 00:17:20.000 |
|
that you could start out |
|
|
|
00:17:17.240 --> 00:17:21.559 |
|
with uh the problem is you also have no |
|
|
|
00:17:20.000 --> 00:17:24.720 |
|
idea how well it works if you don't have |
|
|
|
00:17:21.559 --> 00:17:26.760 |
|
any data whatsoever right so um you'll |
|
|
|
00:17:24.720 --> 00:17:30.400 |
|
you might be in trouble if you think |
|
|
|
00:17:26.760 --> 00:17:30.400 |
|
something should be working |
|
|
|
00:17:30.919 --> 00:17:34.440 |
|
so normally the next thing that people |
|
|
|
00:17:32.919 --> 00:17:36.880 |
|
move to nowadays when they're building |
|
|
|
00:17:34.440 --> 00:17:39.559 |
|
practical systems is rules or prompting
|
|
|
00:17:36.880 --> 00:17:41.240 |
|
based on spot checks so that basically |
|
|
|
00:17:39.559 --> 00:17:42.919 |
|
means that you start out with a |
|
|
|
00:17:41.240 --> 00:17:45.840 |
|
rule-based system or a prompting based |
|
|
|
00:17:42.919 --> 00:17:47.240 |
|
system and then you go in and you run it |
|
|
|
00:17:45.840 --> 00:17:48.720 |
|
on some data that you're interested in |
|
|
|
00:17:47.240 --> 00:17:50.799 |
|
you just kind of qualitatively look at |
|
|
|
00:17:48.720 --> 00:17:52.160 |
|
the data and say oh it's messing up here |
|
|
|
00:17:50.799 --> 00:17:53.440 |
|
then you go in and fix your prompt a |
|
|
|
00:17:52.160 --> 00:17:54.919 |
|
little bit or you go in and fix your |
|
|
|
00:17:53.440 --> 00:17:57.320 |
|
rules a little bit or something like |
|
|
|
00:17:54.919 --> 00:18:00.400 |
|
that so uh this is kind of the second |
|
|
|
00:17:57.320 --> 00:18:00.400 |
|
level of difficulty |
|
|
|
00:18:01.400 --> 00:18:04.640 |
|
so the third level of difficulty would |
|
|
|
00:18:03.159 --> 00:18:07.400 |
|
be something like rules or prompting
|
|
|
00:18:04.640 --> 00:18:09.039 |
|
with rigorous evaluation and so here you |
|
|
|
00:18:07.400 --> 00:18:12.840 |
|
would create a development set with |
|
|
|
00:18:09.039 --> 00:18:14.840 |
|
inputs and outputs uh so you uh create |
|
|
|
00:18:12.840 --> 00:18:17.039 |
|
maybe 200 to 2,000 |
|
|
|
00:18:14.840 --> 00:18:20.080 |
|
examples um |
|
|
|
00:18:17.039 --> 00:18:21.720 |
|
and then evaluate your actual accuracy |
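[Illustration — the simplest evaluation metric this kind of dev set supports, as a sketch:]

```python
# Accuracy over a labeled development set.
def accuracy(predictions: list, labels: list) -> float:
    assert len(predictions) == len(labels)
    return sum(p == y for p, y in zip(predictions, labels)) / len(labels)

print(accuracy([1, 0, 1], [1, 1, 1]))  # -> 0.666...
```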
|
|
|
00:18:20.080 --> 00:18:23.880 |
|
so you need an evaluation metric you |
|
|
|
00:18:21.720 --> 00:18:26.120 |
|
need other things like this this is the |
|
|
|
00:18:23.880 --> 00:18:28.400 |
|
next level of difficulty but if you're |
|
|
|
00:18:26.120 --> 00:18:30.240 |
|
going to be a serious you know NLP |
|
|
|
00:18:28.400 --> 00:18:33.000 |
|
engineer or something like this you |
|
|
|
00:18:30.240 --> 00:18:34.720 |
|
definitely will be doing this a lot I |
|
|
|
00:18:33.000 --> 00:18:37.760 |
|
feel and |
|
|
|
00:18:34.720 --> 00:18:40.360 |
|
then so here now you start needing
|
|
|
00:18:37.760 --> 00:18:41.960 |
|
a dev set and a test set and then
|
|
|
00:18:40.360 --> 00:18:46.280 |
|
finally fine-tuning you need an |
|
|
|
00:18:41.960 --> 00:18:48.480 |
|
additional training set um and uh this |
|
|
|
00:18:46.280 --> 00:18:52.240 |
|
will generally be a lot bigger than 200 |
|
|
|
00:18:48.480 --> 00:18:56.080 |
|
to 2,000 examples and generally the rule |
|
|
|
00:18:52.240 --> 00:18:56.080 |
|
is that
|
|
|
00:18:57.320 --> 00:19:01.080 |
|
|
|
|
00:18:59.520 --> 00:19:02.400 |
|
every time you double your training set |
|
|
|
00:19:01.080 --> 00:19:07.480 |
|
size you get about a constant |
|
|
|
00:19:02.400 --> 00:19:07.480 |
|
improvement so if you start
|
|
|
00:19:07.799 --> 00:19:15.080 |
|
out down here with
|
|
|
00:19:12.240 --> 00:19:17.039 |
|
um zero shot accuracy with a language |
|
|
|
00:19:15.080 --> 00:19:21.559 |
|
model you create a small training
|
|
|
00:19:17.039 --> 00:19:21.559 |
|
set and you get you know a pretty big |
|
|
|
00:19:22.000 --> 00:19:29.120 |
|
increase and then every time you double |
|
|
|
00:19:26.320 --> 00:19:30.799 |
|
it increases by a constant factor it's
|
|
|
00:19:29.120 --> 00:19:32.480 |
|
kind of like just in general in machine |
|
|
|
00:19:30.799 --> 00:19:37.360 |
|
learning this is a trend that we tend to |
|
|
|
00:19:32.480 --> 00:19:40.679 |
|
see so um so based on this
|
|
|
00:19:37.360 --> 00:19:41.880 |
|
uh there's kind of like you get a big |
|
|
|
00:19:40.679 --> 00:19:44.200 |
|
gain from having a little bit of |
|
|
|
00:19:41.880 --> 00:19:45.760 |
|
training data but the gains very quickly |
|
|
|
00:19:44.200 --> 00:19:48.919 |
|
drop off and you start spending a lot of |
|
|
|
00:19:45.760 --> 00:19:48.919 |
|
time annotating |
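[Illustration — the rule of thumb above with made-up numbers: accuracy grows roughly linearly in the log of the training-set size, so per-example gains shrink as the set grows:]

```python
import math

base_accuracy, gain_per_doubling = 0.60, 0.05  # invented for illustration
for n in [250, 500, 1000, 2000, 4000]:
    accuracy = base_accuracy + gain_per_doubling * math.log2(n / 250)
    print(f"{n:>5} examples -> ~{accuracy:.2f} accuracy")
```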
|
|
|
00:19:51.000 --> 00:19:55.880 |
|
so um yeah this is the general
|
|
|
00:19:54.760 --> 00:19:58.280 |
|
overview of the different types of |
|
|
|
00:19:55.880 --> 00:20:00.000 |
|
system building uh any
|
|
|
00:19:58.280 --> 00:20:01.559 |
|
questions about this or comments or |
|
|
|
00:20:00.000 --> 00:20:04.000 |
|
things like |
|
|
|
00:20:01.559 --> 00:20:05.840 |
|
this I think one thing that's changed |
|
|
|
00:20:04.000 --> 00:20:08.159 |
|
really drastically from the last time I |
|
|
|
00:20:05.840 --> 00:20:09.600 |
|
taught this class is the fact that |
|
|
|
00:20:08.159 --> 00:20:11.000 |
|
number one and number two are the things |
|
|
|
00:20:09.600 --> 00:20:13.799 |
|
that people are actually doing in |
|
|
|
00:20:11.000 --> 00:20:15.360 |
|
practice uh which was you know people |
|
|
|
00:20:13.799 --> 00:20:16.679 |
|
who actually care about systems are |
|
|
|
00:20:15.360 --> 00:20:18.880 |
|
doing number one and number two is the |
|
|
|
00:20:16.679 --> 00:20:20.440 |
|
main thing it used to be that if you |
|
|
|
00:20:18.880 --> 00:20:22.679 |
|
were actually serious about building a |
|
|
|
00:20:20.440 --> 00:20:24.320 |
|
system uh you really needed to do the |
|
|
|
00:20:22.679 --> 00:20:27.080 |
|
fine-tuning and now it's kind of like more
|
|
|
00:20:24.320 --> 00:20:27.080 |
|
optional |
|
|
|
00:20:27.159 --> 00:20:30.159 |
|
so |
|
|
|
00:20:44.039 --> 00:20:50.960 |
|
yeah |
|
|
|
00:20:46.320 --> 00:20:53.960 |
|
so it's definitely an empirical
|
|
|
00:20:50.960 --> 00:20:53.960 |
|
observation |
|
|
|
00:20:54.720 --> 00:21:01.080 |
|
um in terms of the theoretical |
|
|
|
00:20:57.640 --> 00:21:03.120 |
|
background I can't immediately
|
|
|
00:21:01.080 --> 00:21:05.840 |
|
point to a |
|
|
|
00:21:03.120 --> 00:21:10.039 |
|
particular paper that does that but I |
|
|
|
00:21:05.840 --> 00:21:12.720 |
|
think if you think about |
|
|
|
00:21:10.039 --> 00:21:14.720 |
|
I think I have seen that they do
|
|
|
00:21:12.720 --> 00:21:17.039 |
|
exist in the past but I can't think of
|
|
|
00:21:14.720 --> 00:21:19.000 |
|
it right now I can try to uh come
|
|
|
00:21:17.039 --> 00:21:23.720 |
|
up with an example of |
|
|
|
00:21:19.000 --> 00:21:23.720 |
|
that so yeah I should take
|
|
|
00:21:26.799 --> 00:21:31.960 |
|
notes or if someone wants to share one on
|
|
|
00:21:29.360 --> 00:21:33.360 |
|
Piazza uh if you have any ideas and want
|
|
|
00:21:31.960 --> 00:21:34.520 |
|
to share on Piazza I'm sure that would be
|
|
|
00:21:33.360 --> 00:21:35.640 |
|
great it'd be great to have a discussion |
|
|
|
00:21:34.520 --> 00:21:39.320 |
|
on |
|
|
|
00:21:35.640 --> 00:21:44.960 |
|
Piazza um
|
|
|
00:21:39.320 --> 00:21:46.880 |
|
cool okay so next I want to try to
|
|
|
00:21:44.960 --> 00:21:48.200 |
|
make a rule-based system and I'm going |
|
|
|
00:21:46.880 --> 00:21:49.360 |
|
to make a rule-based system for |
|
|
|
00:21:48.200 --> 00:21:51.799 |
|
sentiment |
|
|
|
00:21:49.360 --> 00:21:53.480 |
|
analysis uh and this is a bad idea I |
|
|
|
00:21:51.799 --> 00:21:55.400 |
|
would not encourage you to ever do this |
|
|
|
00:21:53.480 --> 00:21:57.440 |
|
in real life but I want to do it here to |
|
|
|
00:21:55.400 --> 00:21:59.640 |
|
show you why it's a bad idea and like |
|
|
|
00:21:57.440 --> 00:22:01.200 |
|
what are some of the hard problems that |
|
|
|
00:21:59.640 --> 00:22:03.960 |
|
you encounter when trying to create a |
|
|
|
00:22:01.200 --> 00:22:06.600 |
|
system based on rules |
|
|
|
00:22:03.960 --> 00:22:08.080 |
|
and then we'll move into building a |
|
|
|
00:22:06.600 --> 00:22:12.360 |
|
machine learning based system after we
|
|
|
00:22:08.080 --> 00:22:15.400 |
|
finish this so if we look at the example |
|
|
|
00:22:12.360 --> 00:22:18.559 |
|
task this is review sentiment analysis
|
|
|
00:22:15.400 --> 00:22:21.799 |
|
it's one of the most valuable uh tasks |
|
|
|
00:22:18.559 --> 00:22:24.039 |
|
uh that people do in NLP nowadays |
|
|
|
00:22:21.799 --> 00:22:26.400 |
|
because it allows people to know how |
|
|
|
00:22:24.039 --> 00:22:29.200 |
|
customers are thinking about products uh |
|
|
|
00:22:26.400 --> 00:22:30.799 |
|
improve you know their product
|
|
|
00:22:29.200 --> 00:22:32.919 |
|
development and other things like that |
|
|
|
00:22:30.799 --> 00:22:34.799 |
|
maybe monitor people's you know
|
|
|
00:22:32.919 --> 00:22:36.760 |
|
satisfaction with their social media |
|
|
|
00:22:34.799 --> 00:22:39.200 |
|
service other things like this so |
|
|
|
00:22:36.760 --> 00:22:42.720 |
|
basically the way it works is um you |
|
|
|
00:22:39.200 --> 00:22:44.400 |
|
have uh sentences as
|
|
|
00:22:42.720 --> 00:22:46.720 |
|
inputs like I hate this movie I love |
|
|
|
00:22:44.400 --> 00:22:48.520 |
|
this movie I saw this movie and this |
|
|
|
00:22:46.720 --> 00:22:50.600 |
|
gets mapped into positive neutral or |
|
|
|
00:22:48.520 --> 00:22:53.120 |
|
negative so I hate this movie would be |
|
|
|
00:22:50.600 --> 00:22:55.480 |
|
negative I love this movie positive and |
|
|
|
00:22:53.120 --> 00:22:59.039 |
|
I saw this movie is |
|
|
|
00:22:55.480 --> 00:23:01.200 |
|
neutral so um |
|
|
|
00:22:59.039 --> 00:23:05.200 |
|
that that's the task input text output
|
|
|
00:23:01.200 --> 00:23:08.880 |
|
labels uh categorical uh sentence
|
|
|
00:23:05.200 --> 00:23:11.679 |
|
label and in order to do this uh we |
|
|
|
00:23:08.880 --> 00:23:13.120 |
|
would like to build a model um and we're |
|
|
|
00:23:11.679 --> 00:23:16.159 |
|
going to build the model in a rule based |
|
|
|
00:23:13.120 --> 00:23:19.000 |
|
way but we'll still call it a model
|
|
|
00:23:16.159 --> 00:23:21.600 |
|
and the way it works is we do feature |
|
|
|
00:23:19.000 --> 00:23:23.159 |
|
extraction um so we extract the salient
|
|
|
00:23:21.600 --> 00:23:25.279 |
|
features for making the decision about |
|
|
|
00:23:23.159 --> 00:23:27.320 |
|
what to output next we do score
|
|
|
00:23:25.279 --> 00:23:29.880 |
|
calculation calculate a score for one or |
|
|
|
00:23:27.320 --> 00:23:32.320 |
|
more possibilities and we have a decision
|
|
|
00:23:29.880 --> 00:23:33.520 |
|
function so we choose one of those |
|
|
|
00:23:32.320 --> 00:23:37.679 |
|
several |
|
|
|
00:23:33.520 --> 00:23:40.120 |
|
possibilities and so for feature |
|
|
|
00:23:37.679 --> 00:23:42.200 |
|
extraction uh formally what this looks |
|
|
|
00:23:40.120 --> 00:23:44.240 |
|
like is we have some function and it |
|
|
|
00:23:42.200 --> 00:23:48.039 |
|
extracts a feature |
|
|
|
00:23:44.240 --> 00:23:51.159 |
|
Vector for score calculation um we |
|
|
|
00:23:48.039 --> 00:23:54.240 |
|
calculate the scores based on either a |
|
|
|
00:23:51.159 --> 00:23:56.279 |
|
binary classification uh where we have a |
|
|
|
00:23:54.240 --> 00:23:58.279 |
|
a weight vector and we take the dot |
|
|
|
00:23:56.279 --> 00:24:00.120 |
|
product with our feature vector or we |
|
|
|
00:23:58.279 --> 00:24:02.480 |
|
have multi class classification where we |
|
|
|
00:24:00.120 --> 00:24:04.520 |
|
have a weight Matrix and we take the |
|
|
|
00:24:02.480 --> 00:24:08.640 |
|
product with uh the vector and that |
|
|
|
00:24:04.520 --> 00:24:08.640 |
|
gives us you know scores over multiple
|
|
|
00:24:08.919 --> 00:24:14.840 |
|
classes
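[Illustration — the two scoring setups just described, with invented numbers:]

```python
import numpy as np

f = np.array([2.0, 1.0, 1.0])        # feature vector from extraction

# Binary classification: dot product of a weight vector with the features.
w = np.array([1.0, -1.0, 0.5])
print(float(w @ f))                  # a single scalar score

# Multiclass classification: weight matrix, one row of weights per class.
W = np.array([[ 1.0, -1.0, 0.1],
              [-1.0,  1.0, 0.1],
              [ 0.0,  0.0, 0.2]])
print(W @ f)                         # one score per class
```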
|
|
|
00:24:11.600 --> 00:24:17.520 |
|
and then we have a decision uh rule so this decision rule tells us what
|
|
|
00:24:14.840 --> 00:24:20.080 |
|
the output is going to be um does anyone |
|
|
|
00:24:17.520 --> 00:24:22.200 |
|
know what a typical decision rule is |
|
|
|
00:24:20.080 --> 00:24:24.520 |
|
maybe so obvious that you don't
|
|
|
00:24:22.200 --> 00:24:28.760 |
|
think about it often |
|
|
|
00:24:24.520 --> 00:24:31.000 |
|
but uh a threshold um so like for would |
|
|
|
00:24:28.760 --> 00:24:34.440 |
|
that be for binary a single binary |
|
|
|
00:24:31.000 --> 00:24:37.000 |
|
scalar score or a multiple
|
|
|
00:24:34.440 --> 00:24:38.520 |
|
class binary yeah so and then you would |
|
|
|
00:24:37.000 --> 00:24:39.960 |
|
pick a threshold and if it's over the |
|
|
|
00:24:38.520 --> 00:24:42.919 |
|
threshold |
|
|
|
00:24:39.960 --> 00:24:45.760 |
|
you say yes and if it's under the |
|
|
|
00:24:42.919 --> 00:24:50.279 |
|
threshold you say no um another option |
|
|
|
00:24:45.760 --> 00:24:51.679 |
|
would be um you have a threshold and you |
|
|
|
00:24:50.279 --> 00:24:56.080 |
|
say |
|
|
|
00:24:51.679 --> 00:24:56.080 |
|
yes no |
|
|
|
00:24:56.200 --> 00:25:00.559 |
|
abstain so you know you don't give an
|
|
|
00:24:58.360 --> 00:25:02.520 |
|
answer and depending on how you're |
|
|
|
00:25:00.559 --> 00:25:03.720 |
|
evaluated what is a good classifier
|
|
|
00:25:02.520 --> 00:25:07.799 |
|
you might want to abstain some of the |
|
|
|
00:25:03.720 --> 00:25:10.960 |
|
time also um for multiclass what's
|
|
|
00:25:07.799 --> 00:25:10.960 |
|
a standard decision rule for
|
|
|
00:25:11.120 --> 00:25:16.720 |
|
multiclass argmax yeah exactly so um |
|
|
|
00:25:14.279 --> 00:25:19.520 |
|
basically you find the index that
|
|
|
00:25:16.720 --> 00:25:22.000 |
|
has the highest score and you output
|
|
|
00:25:19.520 --> 00:25:24.480 |
|
it
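[Illustration — the decision rules just discussed, sketched in code: a threshold for a binary score, a two-sided threshold that abstains, and argmax for multiclass:]

```python
import numpy as np

def binary_decision(score: float, threshold: float = 0.0) -> str:
    return "yes" if score > threshold else "no"

def binary_with_abstain(score: float, low: float = -0.5, high: float = 0.5) -> str:
    if score > high:
        return "yes"
    if score < low:
        return "no"
    return "abstain"  # don't answer when the score is in the uncertain band

def multiclass_decision(scores: np.ndarray) -> int:
    return int(np.argmax(scores))  # index of the highest-scoring class
```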
|
|
|
00:25:22.000 --> 00:25:26.559 |
|
we're going to be talking about other decision rules also um like
|
|
|
00:25:24.480 --> 00:25:29.480 |
|
self-consistency and minimum Bayes risk
|
|
|
00:25:26.559 --> 00:25:30.760 |
|
later uh for text generation so you can |
|
|
|
00:25:29.480 --> 00:25:33.000 |
|
just keep that in mind and then we'll |
|
|
|
00:25:30.760 --> 00:25:36.279 |
|
forget about it for like several |
|
|
|
00:25:33.000 --> 00:25:39.559 |
|
classes um so for sentiment |
|
|
|
00:25:36.279 --> 00:25:42.159 |
|
classification um I have a code
|
|
|
00:25:39.559 --> 00:25:45.159 |
|
walkthrough
|
|
|
00:25:42.159 --> 00:25:45.159 |
|
here |
|
|
|
00:25:46.240 --> 00:25:54.320 |
|
and this is pretty simple um but if |
|
|
|
00:25:50.320 --> 00:25:58.559 |
|
you're bored uh of the class and would |
|
|
|
00:25:54.320 --> 00:26:01.000 |
|
like to um try it out yourself you can
|
|
|
00:25:58.559 --> 00:26:04.480 |
|
challenge and try to get a better score
|
|
|
00:26:01.000 --> 00:26:06.120 |
|
than I do um over the next few minutes |
|
|
|
00:26:04.480 --> 00:26:06.880 |
|
but we have this rule based classifier |
|
|
|
00:26:06.120 --> 00:26:10.240 |
|
in |
|
|
|
00:26:06.880 --> 00:26:12.640 |
|
here and I will open it up in my VS
|
|
|
00:26:10.240 --> 00:26:15.360 |
|
Code
|
|
|
00:26:12.640 --> 00:26:18.360 |
|
to try to create a rule-based classifier |
|
|
|
00:26:15.360 --> 00:26:18.360 |
|
and basically the way this |
|
|
|
00:26:22.799 --> 00:26:29.960 |
|
works is |
|
|
|
00:26:25.159 --> 00:26:29.960 |
|
that we have a feature |
|
|
|
00:26:31.720 --> 00:26:37.720 |
|
extraction we
|
|
|
00:26:34.120 --> 00:26:40.679 |
|
have scoring and we have um a decision |
|
|
|
00:26:37.720 --> 00:26:43.480 |
|
rule so here for our feature extraction I
|
|
|
00:26:40.679 --> 00:26:44.720 |
|
have created a list of good words and a |
|
|
|
00:26:43.480 --> 00:26:46.720 |
|
list of bad |
|
|
|
00:26:44.720 --> 00:26:48.960 |
|
words |
|
|
|
00:26:46.720 --> 00:26:51.320 |
|
and what we do is we just count the |
|
|
|
00:26:48.960 --> 00:26:53.000 |
|
number of good words that appeared and |
|
|
|
00:26:51.320 --> 00:26:55.320 |
|
count the number of bad words that |
|
|
|
00:26:53.000 --> 00:26:57.880 |
|
appeared then we also have a bias |
|
|
|
00:26:55.320 --> 00:27:01.159 |
|
feature so the bias feature is a feature |
|
|
|
00:26:57.880 --> 00:27:03.679 |
|
that's always one and so what that |
|
|
|
00:27:01.159 --> 00:27:06.799 |
|
results in is we have a dimension three |
|
|
|
00:27:03.679 --> 00:27:08.880 |
|
feature vector um where this is like the
|
|
|
00:27:06.799 --> 00:27:11.320 |
|
number of good words this is the number |
|
|
|
00:27:08.880 --> 00:27:15.320 |
|
of bad words and then you have the |
|
|
|
00:27:11.320 --> 00:27:17.760 |
|
bias and then I also define the feature
|
|
|
00:27:15.320 --> 00:27:20.039 |
|
weights so that for every good word we
|
|
|
00:27:17.760 --> 00:27:22.200 |
|
add one to our score for every bad word |
|
|
|
00:27:20.039 --> 00:27:25.559 |
|
we uh subtract one from our score
|
|
|
00:27:22.200 --> 00:27:29.399 |
|
and for the bias we add 0.5 and so we then
|
|
|
00:27:25.559 --> 00:27:30.480 |
|
take the dot product between |
|
|
|
00:27:29.399 --> 00:27:34.360 |
|
these |
|
|
|
00:27:30.480 --> 00:27:36.919 |
|
two and we get minus |
|
|
|
00:27:34.360 --> 00:27:37.640 |
|
0.5 and that gives us uh that gives us |
|
|
|
00:27:36.919 --> 00:27:41.000 |
|
the |
|
|
|
00:27:37.640 --> 00:27:46.000 |
|
score
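[Illustration — a hedged reconstruction of the scorer being walked through; the word lists and exact weight values are illustrative, not the classroom file:]

```python
GOOD_WORDS = {"love", "good", "great", "excellent"}
BAD_WORDS = {"hate", "bad", "terrible", "crass"}

def extract_features(text: str) -> list[float]:
    words = text.lower().split()
    return [
        float(sum(w in GOOD_WORDS for w in words)),  # number of good words
        float(sum(w in BAD_WORDS for w in words)),   # number of bad words
        1.0,                                         # bias, always one
    ]

FEATURE_WEIGHTS = [1.0, -1.0, 0.5]  # +1 per good word, -1 per bad word, bias

def score(text: str) -> float:
    return sum(f * w for f, w in zip(extract_features(text), FEATURE_WEIGHTS))

print(score("a bad movie"))  # features [0, 1, 1] -> 0 - 1 + 0.5 = -0.5
```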
|
|
|
00:27:41.000 --> 00:27:50.320 |
|
so let's run that um and I read in some
|
|
|
00:27:46.000 --> 00:27:52.600 |
|
data and what this data looks like is |
|
|
|
00:27:50.320 --> 00:27:55.000 |
|
basically we have a |
|
|
|
00:27:52.600 --> 00:27:57.559 |
|
review um which says The Rock is
|
|
|
00:27:55.000 --> 00:27:59.480 |
|
destined to be the 21st Century's new |
|
|
|
00:27:57.559 --> 00:28:01.240 |
|
Conan and that he's going to make a |
|
|
|
00:27:59.480 --> 00:28:03.600 |
|
splash even greater than Arnold |
|
|
|
00:28:01.240 --> 00:28:07.000 |
|
Schwarzenegger Jean-Claude Van Damme or
|
|
|
00:28:03.600 --> 00:28:09.519 |
|
Steven Seagal um so this seems pretty |
|
|
|
00:28:07.000 --> 00:28:10.840 |
|
positive right like that's a pretty
|
|
|
00:28:09.519 --> 00:28:13.200 |
|
high order to be better than Arnold |
|
|
|
00:28:10.840 --> 00:28:16.080 |
|
Schwarzenegger or Jean-Claude Van Damme uh
|
|
|
00:28:13.200 --> 00:28:19.519 |
|
if you're familiar with action movies um |
|
|
|
00:28:16.080 --> 00:28:22.840 |
|
and so of course this gets a positive |
|
|
|
00:28:19.519 --> 00:28:24.120 |
|
label and so uh we have run classifier |
|
|
|
00:28:22.840 --> 00:28:25.240 |
|
actually maybe I should call this |
|
|
|
00:28:24.120 --> 00:28:27.600 |
|
decision rule because this is |
|
|
|
00:28:25.240 --> 00:28:29.120 |
|
essentially our decision rule and here
|
|
|
00:28:27.600 --> 00:28:32.600 |
|
we basically do the thing that I mentioned
|
|
|
00:28:29.120 --> 00:28:35.440 |
|
here the yes no abstain or in this case
|
|
|
00:28:32.600 --> 00:28:38.360 |
|
positive negative neutral so if the |
|
|
|
00:28:35.440 --> 00:28:40.159 |
|
score is greater than zero we uh return |
|
|
|
00:28:38.360 --> 00:28:42.480 |
|
one if the score is less than zero we |
|
|
|
00:28:40.159 --> 00:28:44.679 |
|
return negative one which is negative |
|
|
|
00:28:42.480 --> 00:28:47.240 |
|
and otherwise we return
|
|
|
00:28:44.679 --> 00:28:48.760 |
|
zero
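[Illustration — that decision rule as code, matching the description (1 = positive, -1 = negative, 0 = neutral):]

```python
def decision_rule(score: float) -> int:
    if score > 0:
        return 1
    elif score < 0:
        return -1
    else:
        return 0
```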
|
|
|
00:28:47.240 --> 00:28:51.519 |
|
um we have an accuracy calculation function just calculating whether the outputs
|
|
|
00:28:48.760 --> 00:28:55.840 |
|
are good and |
|
|
|
00:28:51.519 --> 00:28:57.440 |
|
um this is uh the overall label count in |
|
|
|
00:28:55.840 --> 00:28:59.919 |
|
the output so we can see there are
|
|
|
00:28:57.440 --> 00:29:03.120 |
|
slightly more positives than there are |
|
|
|
00:28:59.919 --> 00:29:06.080 |
|
negatives and then we can run this and |
|
|
|
00:29:03.120 --> 00:29:10.200 |
|
we get a score of
|
|
|
00:29:06.080 --> 00:29:14.760 |
|
43 and so one thing that I have
|
|
|
00:29:10.200 --> 00:29:19.279 |
|
found um is I do a lot of kind
|
|
|
00:29:14.760 --> 00:29:21.240 |
|
of research on how to make NLP systems |
|
|
|
00:29:19.279 --> 00:29:23.600 |
|
better and one of the things I found |
|
|
|
00:29:21.240 --> 00:29:26.679 |
|
really invaluable |
|
|
|
00:29:23.600 --> 00:29:27.840 |
|
is if you're in a situation where you |
|
|
|
00:29:26.679 --> 00:29:29.720 |
|
have a |
|
|
|
00:29:27.840 --> 00:29:31.760 |
|
set task and you just want to make the |
|
|
|
00:29:29.720 --> 00:29:33.760 |
|
system better on the set task doing |
|
|
|
00:29:31.760 --> 00:29:35.159 |
|
comprehensive error analysis and |
|
|
|
00:29:33.760 --> 00:29:37.320 |
|
understanding where your system is |
|
|
|
00:29:35.159 --> 00:29:39.880 |
|
failing is one of the best ways to do |
|
|
|
00:29:37.320 --> 00:29:42.200 |
|
that and I would like to do a very |
|
|
|
00:29:39.880 --> 00:29:43.640 |
|
rudimentary version of this here and |
|
|
|
00:29:42.200 --> 00:29:46.519 |
|
what I'm doing essentially is I'm just |
|
|
|
00:29:43.640 --> 00:29:47.480 |
|
randomly picking uh several examples |
|
|
|
00:29:46.519 --> 00:29:49.320 |
|
that were |
|
|
|
00:29:47.480 --> 00:29:52.000 |
|
incorrect
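[Illustration — a hedged sketch of this rudimentary error analysis: sample a few dev examples the classifier got wrong and inspect them by hand; names are illustrative:]

```python
import random

def sample_errors(dev_data, predict, k=5, seed=0):
    """dev_data: list of (text, true_label) pairs; predict: text -> label."""
    errors = [(x, y) for x, y in dev_data if predict(x) != y]
    for x, y in random.Random(seed).sample(errors, min(k, len(errors))):
        print(f"true={y} pred={predict(x)} text={x[:80]}")
```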
|
|
|
00:29:49.320 --> 00:29:54.840 |
|
um and so like let's look at the
|
|
|
00:29:52.000 --> 00:29:58.200 |
|
examples here um here the true label is |
|
|
|
00:29:54.840 --> 00:30:00.760 |
|
zero um and it predicted one um it may
|
|
|
00:29:58.200 --> 00:30:03.440 |
|
not be as cutting as Woody or as true as |
|
|
|
00:30:00.760 --> 00:30:05.039 |
|
back in the glory days of uh Weekend and
|
|
|
00:30:03.440 --> 00:30:07.440 |
|
Two or Three Things I Know About
|
|
|
00:30:05.039 --> 00:30:09.640 |
|
Her but who else engaged in filmmaking
|
|
|
00:30:07.440 --> 00:30:12.679 |
|
today is so cognizant of the cultural |
|
|
|
00:30:09.640 --> 00:30:14.480 |
|
and moral issues involved in the process |
|
|
|
00:30:12.679 --> 00:30:17.600 |
|
so what words in here are a good |
|
|
|
00:30:14.480 --> 00:30:20.840 |
|
indication that this is a neutral |
|
|
|
00:30:17.600 --> 00:30:20.840 |
|
sentence any |
|
|
|
00:30:23.760 --> 00:30:28.399 |
|
ideas a little bit tough
|
|
|
00:30:26.240 --> 00:30:30.919 |
|
huh starting to think maybe we should be |
|
|
|
00:30:28.399 --> 00:30:30.919 |
|
using machine |
|
|
|
00:30:31.480 --> 00:30:37.440 |
|
learning |
|
|
|
00:30:34.080 --> 00:30:40.320 |
|
um even by the intentionally low |
|
|
|
00:30:37.440 --> 00:30:41.559 |
|
standards of frat-boy humor Sorority Boys
|
|
|
00:30:40.320 --> 00:30:43.840 |
|
is a |
|
|
|
00:30:41.559 --> 00:30:46.080 |
|
bowser I think frat boy is maybe
|
|
|
00:30:43.840 --> 00:30:47.360 |
|
negative sentiment if you're familiar |
|
|
|
00:30:46.080 --> 00:30:50.360 |
|
with |
|
|
|
00:30:47.360 --> 00:30:51.960 |
|
the US I don't have any negative
|
|
|
00:30:50.360 --> 00:30:54.519 |
|
sentiment but the people who say it that |
|
|
|
00:30:51.960 --> 00:30:55.960 |
|
way have negative sentiment maybe so if we
|
|
|
00:30:54.519 --> 00:31:01.080 |
|
wanted to go in and do that we could |
|
|
|
00:30:55.960 --> 00:31:01.080 |
|
maybe I won't save this but |
|
|
|
00:31:01.519 --> 00:31:08.919 |
|
uh |
|
|
|
00:31:04.240 --> 00:31:11.840 |
|
um oh whoops I'll go back and fix it uh |
|
|
|
00:31:08.919 --> 00:31:14.840 |
|
crass crass is pretty obviously negative |
|
|
|
00:31:11.840 --> 00:31:14.840 |
|
right so I can add |
|
|
|
00:31:17.039 --> 00:31:21.080 |
|
crass actually let me just add |
|
|
|
00:31:21.760 --> 00:31:29.159 |
|
cr and then um I'll go back and look at our
|
|
|
00:31:26.559 --> 00:31:29.159 |
|
training accuracy
|
|
|
00:31:32.159 --> 00:31:36.240 |
|
wait maybe I need to run the whole
|
|
|
00:31:33.960 --> 00:31:36.240 |
|
thing |
|
|
|
00:31:36.960 --> 00:31:39.960 |
|
again |
|
|
|
00:31:40.960 --> 00:31:45.880 |
|
and that budged the training accuracy a
|
|
|
00:31:43.679 --> 00:31:50.360 |
|
little um the dev test accuracy not very |
|
|
|
00:31:45.880 --> 00:31:53.919 |
|
much so I could go through and do this |
|
|
|
00:31:50.360 --> 00:31:53.919 |
|
um let me add |
|
|
|
00:31:54.000 --> 00:31:58.320 |
|
unengaging so I could go through and do |
|
|
|
00:31:56.000 --> 00:32:01.720 |
|
this all day and you'd probably be very
|
|
|
00:31:58.320 --> 00:32:01.720 |
|
bored
|
|
|
00:32:04.240 --> 00:32:08.360 |
|
but I won't do that uh because we
|
|
|
00:32:06.919 --> 00:32:10.679 |
|
have much more important things to be |
|
|
|
00:32:08.360 --> 00:32:14.679 |
|
doing |
|
|
|
00:32:10.679 --> 00:32:16.440 |
|
um and uh so anyway we um we could go |
|
|
|
00:32:14.679 --> 00:32:18.919 |
|
through and design all the features here |
|
|
|
00:32:16.440 --> 00:32:21.279 |
|
but like why is this complicated like |
|
|
|
00:32:18.919 --> 00:32:22.600 |
|
the reason why it was complicated
|
|
|
00:32:21.279 --> 00:32:25.840 |
|
became pretty |
|
|
|
00:32:22.600 --> 00:32:27.840 |
|
clear from the uh from the very |
|
|
|
00:32:25.840 --> 00:32:29.639 |
|
beginning uh the very first example I |
|
|
|
00:32:27.840 --> 00:32:32.200 |
|
showed you which was that was a really |
|
|
|
00:32:29.639 --> 00:32:34.720 |
|
complicated sentence like all of us |
|
|
|
00:32:32.200 --> 00:32:36.240 |
|
could see that it wasn't like really |
|
|
|
00:32:34.720 --> 00:32:38.679 |
|
strongly positive it wasn't really |
|
|
|
00:32:36.240 --> 00:32:40.519 |
|
strongly negative it was kind of like in |
|
|
|
00:32:38.679 --> 00:32:42.919 |
|
the middle but it was in the middle and |
|
|
|
00:32:40.519 --> 00:32:44.600 |
|
it said it in a very long way uh you |
|
|
|
00:32:42.919 --> 00:32:46.120 |
|
know not using any clearly positive |
|
|
|
00:32:44.600 --> 00:32:47.639 |
|
sentiment words not using any clearly |
|
|
|
00:32:46.120 --> 00:32:49.760 |
|
negative sentiment |
|
|
|
00:32:47.639 --> 00:32:53.760 |
|
words |
|
|
|
00:32:49.760 --> 00:32:56.519 |
|
um so yeah basically I
|
|
|
00:32:53.760 --> 00:33:00.559 |
|
improved um but what are the difficult |
|
|
|
00:32:56.519 --> 00:33:03.720 |
|
cases uh that we saw here so the first |
|
|
|
00:33:00.559 --> 00:33:07.639 |
|
one is low frequency |
|
|
|
00:33:03.720 --> 00:33:09.760 |
|
words so um here's an example the action |
|
|
|
00:33:07.639 --> 00:33:11.519 |
|
switches between past and present but |
|
|
|
00:33:09.760 --> 00:33:13.120 |
|
the material link is too tenuous to |
|
|
|
00:33:11.519 --> 00:33:16.840 |
|
anchor the emotional connections that
|
|
|
00:33:13.120 --> 00:33:19.519 |
|
purport to span a 125-year divide so
|
|
|
00:33:16.840 --> 00:33:21.080 |
|
this is negative um tenuous is kind of a |
|
|
|
00:33:19.519 --> 00:33:22.799 |
|
negative word purport is kind of a |
|
|
|
00:33:21.080 --> 00:33:24.760 |
|
negative word but it doesn't appear very |
|
|
|
00:33:22.799 --> 00:33:26.159 |
|
frequently so I would need to spend all |
|
|
|
00:33:24.760 --> 00:33:29.720 |
|
my time looking for these words and |
|
|
|
00:33:26.159 --> 00:33:32.480 |
|
trying to add them in um here's yet another
|
|
|
00:33:29.720 --> 00:33:34.240 |
|
horror franchise mucking up its storyline
|
|
|
00:33:32.480 --> 00:33:36.639 |
|
with glitches casual fans could correct |
|
|
|
00:33:34.240 --> 00:33:40.159 |
|
in their sleep negative |
|
|
|
00:33:36.639 --> 00:33:42.600 |
|
again um so the solutions here are keep |
|
|
|
00:33:40.159 --> 00:33:46.880 |
|
working until we get all of them which |
|
|
|
00:33:42.600 --> 00:33:49.159 |
|
is maybe not super fun um or incorporate |
|
|
|
00:33:46.880 --> 00:33:51.639 |
|
external resources such as sentiment |
|
|
|
00:33:49.159 --> 00:33:52.880 |
|
dictionaries that people created uh we |
|
|
|
00:33:51.639 --> 00:33:55.960 |
|
could do that but that's a lot of |
|
|
|
00:33:52.880 --> 00:33:57.480 |
|
engineering effort to make something |
|
|
|
00:33:55.960 --> 00:34:00.639 |
|
work |
|
|
|
00:33:57.480 --> 00:34:03.720 |
|
um another one is conjugation so we saw |
|
|
|
00:34:00.639 --> 00:34:06.600 |
|
unengaging I guess that's an example of |
|
|
|
00:34:03.720 --> 00:34:08.359 |
|
conjugation uh some other ones are |
|
|
|
00:34:06.600 --> 00:34:10.520 |
|
operatic sprawling picture that's |
|
|
|
00:34:08.359 --> 00:34:12.040 |
|
entertainingly acted magnificently shot |
|
|
|
00:34:10.520 --> 00:34:15.480 |
|
and gripping enough to sustain most of |
|
|
|
00:34:12.040 --> 00:34:17.399 |
|
its 170-minute length so here we have
|
|
|
00:34:15.480 --> 00:34:19.079 |
|
magnificently so even if I added |
|
|
|
00:34:17.399 --> 00:34:20.480 |
|
magnificent this wouldn't have been |
|
|
|
00:34:19.079 --> 00:34:23.800 |
|
clocked |
|
|
|
00:34:20.480 --> 00:34:26.599 |
|
right um it's basically an overlong |
|
|
|
00:34:23.800 --> 00:34:28.839 |
|
episode of Tales from the Crypt so that's
|
|
|
00:34:26.599 --> 00:34:31.480 |
|
maybe another |
|
|
|
00:34:28.839 --> 00:34:33.040 |
|
example um so some things that we could |
|
|
|
00:34:31.480 --> 00:34:35.320 |
|
do or what we would have done before the |
|
|
|
00:34:33.040 --> 00:34:37.720 |
|
modern paradigm of machine learning is
|
|
|
00:34:35.320 --> 00:34:40.079 |
|
we would run some sort of normalizer |
|
|
|
00:34:37.720 --> 00:34:42.800 |
|
like a stemmer or other things like this |
|
|
|
00:34:40.079 --> 00:34:45.240 |
|
in order to convert this into uh the |
|
|
|
00:34:42.800 --> 00:34:48.599 |
|
root words that we already have seen
|
|
|
00:34:45.240 --> 00:34:52.040 |
|
somewhere in our data or have already |
|
|
|
00:34:48.599 --> 00:34:54.040 |
|
handled so that requires um conjugation
|
|
|
00:34:52.040 --> 00:34:55.879 |
|
analysis or morphological analysis as we |
|
|
|
00:34:54.040 --> 00:34:57.400 |
|
say it in |
|
|
|
00:34:55.879 --> 00:35:00.680 |
|
technical terms
|
|
|
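For instance, a Porter stemmer from NLTK can map conjugated or derived forms down to a shared stem. A sketch, assuming the sentiment dictionary is stemmed the same way; the exact stems shown are approximate:

```python
# Normalizing inflected/derived forms with a stemmer before dictionary
# lookup; NLTK's PorterStemmer is one common choice.
from nltk.stem import PorterStemmer

stemmer = PorterStemmer()
for word in ["magnificently", "entertainingly", "gripping"]:
    print(word, "->", stemmer.stem(word))
# The stems are crude truncations (e.g. roughly "magnific"), so the
# sentiment dictionary itself would need to store stems, not full words.
```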
00:34:57.400 --> 00:35:03.960 |
|
negation this is a tricky one so this |
|
|
|
00:35:00.680 --> 00:35:06.760 |
|
one's not nearly as dreadful as expected
|
|
|
00:35:03.960 --> 00:35:08.800 |
|
so dreadful is a pretty bad word right
|
|
|
00:35:06.760 --> 00:35:13.000 |
|
but not nearly as dreadful as expected
|
|
|
00:35:08.800 --> 00:35:14.440 |
|
is like a solidly neutral um you know or |
|
|
|
00:35:13.000 --> 00:35:16.359 |
|
maybe even |
|
|
|
00:35:14.440 --> 00:35:18.920 |
|
positive I would I would say that's |
|
|
|
00:35:16.359 --> 00:35:20.640 |
|
neutral but you know uh neutral or |
|
|
|
00:35:18.920 --> 00:35:23.800 |
|
positive it's definitely not |
|
|
|
00:35:20.640 --> 00:35:26.359 |
|
negative um Serving Sara doesn't serve up a
|
|
|
00:35:23.800 --> 00:35:29.480 |
|
whole lot of laughs so laughs is |
|
|
|
00:35:26.359 --> 00:35:31.880 |
|
obviously positive but not serving up
|
|
|
00:35:29.480 --> 00:35:34.440 |
|
is obviously |
|
|
|
00:35:31.880 --> 00:35:36.839 |
|
negative so if negation modifies the |
|
|
|
00:35:34.440 --> 00:35:38.240 |
|
word and we want to disregard it, we would probably
|
|
|
00:35:36.839 --> 00:35:41.440 |
|
need to do some sort of syntactic |
|
|
|
00:35:38.240 --> 00:35:45.599 |
|
analysis or semantic analysis of |
|
|
|
00:35:41.440 --> 00:35:47.520 |
|
some sort. Metaphor and analogy: so, puts a human
|
|
|
00:35:45.599 --> 00:35:50.640 |
|
face on a land most westerners are |
|
|
|
00:35:47.520 --> 00:35:52.880 |
|
unfamiliar with uh this is
|
|
|
00:35:50.640 --> 00:35:54.960 |
|
positive. Green might want to hang on to
|
|
|
00:35:52.880 --> 00:35:58.800 |
|
that ski mask as robbery may be the only |
|
|
|
00:35:54.960 --> 00:35:58.800 |
|
way to pay for his next project
|
|
|
00:35:58.839 --> 00:36:03.640 |
|
so this this is saying that the movie |
|
|
|
00:36:01.960 --> 00:36:05.560 |
|
was so bad that the director will have |
|
|
|
00:36:03.640 --> 00:36:08.359 |
|
to rob people in order to get money for |
|
|
|
00:36:05.560 --> 00:36:11.000 |
|
the next project so that's kind of bad I |
|
|
|
00:36:08.359 --> 00:36:12.880 |
|
guess um has all the depth of a wading
|
|
|
00:36:11.000 --> 00:36:14.520 |
|
pool this is kind of my favorite one |
|
|
|
00:36:12.880 --> 00:36:15.880 |
|
because it's really short and sweet but |
|
|
|
00:36:14.520 --> 00:36:18.800 |
|
you know you need to know how deep a |
|
|
|
00:36:15.880 --> 00:36:21.440 |
|
wading pool is um so that's
|
|
|
00:36:18.800 --> 00:36:22.960 |
|
negative so the solution here I don't |
|
|
|
00:36:21.440 --> 00:36:24.680 |
|
really even know how to handle this with |
|
|
|
00:36:22.960 --> 00:36:26.880 |
|
a rule based system I have no idea how |
|
|
|
00:36:24.680 --> 00:36:30.040 |
|
we would possibly do this yeah machine |
|
|
|
00:36:26.880 --> 00:36:32.400 |
|
learning based models seem to be pretty |
|
|
|
00:36:30.040 --> 00:36:37.000 |
|
adaptive okay and then I start doing |
|
|
|
00:36:32.400 --> 00:36:37.000 |
|
these ones um anyone have a good |
|
|
|
00:36:38.160 --> 00:36:46.800 |
|
idea any other friends who know
|
|
|
00:36:42.520 --> 00:36:50.040 |
|
Japanese no okay um so yeah that's |
|
|
|
00:36:46.800 --> 00:36:52.839 |
|
positive um that one's negative uh and |
|
|
|
00:36:50.040 --> 00:36:54.920 |
|
the solution here is learn Japanese I |
|
|
|
00:36:52.839 --> 00:36:56.800 |
|
guess or whatever other language you |
|
|
|
00:36:54.920 --> 00:37:00.040 |
|
want to process so like obviously |
|
|
|
00:36:56.800 --> 00:37:03.720 |
|
rule-based systems don't scale very |
|
|
|
00:37:00.040 --> 00:37:05.119 |
|
well so um we've moved on but like rule
|
|
|
00:37:03.720 --> 00:37:06.319 |
|
based systems don't scale very well |
|
|
|
00:37:05.119 --> 00:37:08.160 |
|
we're not going to be using them for |
|
|
|
00:37:06.319 --> 00:37:11.400 |
|
most of the things we do in this class |
|
|
|
00:37:08.160 --> 00:37:14.240 |
|
but I do think it's sometimes useful to |
|
|
|
00:37:11.400 --> 00:37:15.640 |
|
try to create one for your task maybe |
|
|
|
00:37:14.240 --> 00:37:16.680 |
|
right at the very beginning of a project |
|
|
|
00:37:15.640 --> 00:37:18.560 |
|
because it gives you an idea about |
|
|
|
00:37:16.680 --> 00:37:21.160 |
|
what's really hard about the task in |
|
|
|
00:37:18.560 --> 00:37:22.480 |
|
some cases so um yeah I wouldn't |
|
|
|
00:37:21.160 --> 00:37:25.599 |
|
entirely discount them I'm not |
|
|
|
00:37:22.480 --> 00:37:27.400 |
|
introducing them for no reason |
|
|
|
00:37:25.599 --> 00:37:29.880 |
|
whatsoever |
|
|
|
00:37:27.400 --> 00:37:34.160 |
|
so next is machine learning based analysis
|
|
|
00:37:29.880 --> 00:37:35.400 |
|
and machine learning uh in general uh I |
|
|
|
00:37:34.160 --> 00:37:36.640 |
|
here actually when I say machine |
|
|
|
00:37:35.400 --> 00:37:38.160 |
|
learning I'm going to be talking about |
|
|
|
00:37:36.640 --> 00:37:39.560 |
|
the traditional fine-tuning approach |
|
|
|
00:37:38.160 --> 00:37:43.520 |
|
where we have a training set Dev set |
|
|
|
00:37:39.560 --> 00:37:46.359 |
|
test set and so we take our training set |
|
|
|
00:37:43.520 --> 00:37:49.680 |
|
we run some learning algorithm over it |
|
|
|
00:37:46.359 --> 00:37:52.319 |
|
we have a
|
|
|
00:37:49.680 --> 00:37:55.839 |
|
possibly learned feature extractor f and a
|
|
|
00:37:52.319 --> 00:37:57.880 |
|
possibly learned scoring function W and
|
|
|
00:37:55.839 --> 00:38:00.800 |
|
uh then we apply our inference algorithm |
|
|
|
00:37:57.880 --> 00:38:02.839 |
|
our decision Rule and make decisions |
|
|
|
00:38:00.800 --> 00:38:04.200 |
|
when I say possibly learned actually the |
|
|
|
00:38:02.839 --> 00:38:06.119 |
|
first example I'm going to give of a |
|
|
|
00:38:04.200 --> 00:38:07.760 |
|
machine learning based technique is uh |
|
|
|
00:38:06.119 --> 00:38:10.079 |
|
doesn't have a learned feature extractor |
|
|
|
00:38:07.760 --> 00:38:12.800 |
|
but most things that we use nowadays do |
|
|
|
00:38:10.079 --> 00:38:12.800 |
|
have learned feature |
|
|
|
00:38:13.200 --> 00:38:18.040 |
|
extractors so our first attempt is going |
|
|
|
00:38:15.640 --> 00:38:21.760 |
|
to be a bag of words model uh and the |
|
|
|
00:38:18.040 --> 00:38:27.119 |
|
way a bag of words model works is uh
|
|
|
00:38:21.760 --> 00:38:30.160 |
|
essentially we start out by looking up a |
|
|
|
00:38:27.119 --> 00:38:33.240 |
|
Vector where one element in the vector |
|
|
|
00:38:30.160 --> 00:38:36.240 |
|
is uh is one and all the other elements |
|
|
|
00:38:33.240 --> 00:38:38.040 |
|
in the vector are zero and so if the |
|
|
|
00:38:36.240 --> 00:38:40.319 |
|
word is different the position in the |
|
|
|
00:38:38.040 --> 00:38:42.839 |
|
vector that's one will be different we |
|
|
|
00:38:40.319 --> 00:38:46.280 |
|
add all of these together and this gives |
|
|
|
00:38:42.839 --> 00:38:48.200 |
|
us a vector where each element is the |
|
|
|
00:38:46.280 --> 00:38:50.359 |
|
frequency of that word in the sentence and
|
|
|
00:38:48.200 --> 00:38:52.520 |
|
then we multiply that by weights and we |
|
|
|
00:38:50.359 --> 00:38:55.520 |
|
get a |
|
|
|
00:38:52.520 --> 00:38:57.160 |
|
score and um here as I said this is not |
|
|
|
00:38:55.520 --> 00:39:00.359 |
|
a learned feature |
|
|
|
00:38:57.160 --> 00:39:02.079 |
|
uh vector this is basically uh sorry not
|
|
|
00:39:00.359 --> 00:39:04.359 |
|
a learned feature extractor this is
|
|
|
00:39:02.079 --> 00:39:06.200 |
|
basically a fixed feature extractor but |
|
|
|
00:39:04.359 --> 00:39:09.839 |
|
the weights themselves are learned.
|
|
|
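As a sketch, with a toy vocabulary (the names and vocabulary are illustrative assumptions, not the lecture notebook's code):

```python
# Bag-of-words scoring: summed one-hot vectors give word counts, and a
# dot product with learned weights gives the score. The feature
# extractor here is fixed; only `weights` would be learned.
import numpy as np

vocab = {"i": 0, "love": 1, "hate": 2, "this": 3, "movie": 4}

def bow_features(sentence: str) -> np.ndarray:
    counts = np.zeros(len(vocab))
    for word in sentence.lower().split(" "):
        if word in vocab:
            counts[vocab[word]] += 1.0  # add a one-hot vector for this word
    return counts

weights = np.zeros(len(vocab))  # learned, e.g. by the training loop shown later
score = float(weights @ bow_features("i love this movie"))
```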
00:39:06.200 --> 00:39:11.640 |
|
um so my question is I
|
|
|
00:39:09.839 --> 00:39:14.599 |
|
mentioned a whole lot of problems before |
|
|
|
00:39:11.640 --> 00:39:17.480 |
|
I mentioned infrequent words I mentioned |
|
|
|
00:39:14.599 --> 00:39:20.760 |
|
conjugation I mentioned uh different |
|
|
|
00:39:17.480 --> 00:39:22.880 |
|
languages I mentioned syntax and |
|
|
|
00:39:20.760 --> 00:39:24.599 |
|
metaphor so which of these do we think |
|
|
|
00:39:22.880 --> 00:39:25.440 |
|
would be fixed by this sort of learning |
|
|
|
00:39:24.599 --> 00:39:27.400 |
|
based |
|
|
|
00:39:25.440 --> 00:39:29.640 |
|
approach |
|
|
|
00:39:27.400 --> 00:39:29.640 |
|
any |
|
|
|
00:39:29.920 --> 00:39:35.200 |
|
ideas maybe not fixed maybe made |
|
|
|
00:39:32.520 --> 00:39:35.200 |
|
significantly |
|
|
|
00:39:36.880 --> 00:39:41.560 |
|
better any Brave uh brave |
|
|
|
00:39:44.880 --> 00:39:48.440 |
|
people maybe maybe |
|
|
|
00:39:53.720 --> 00:39:58.400 |
|
negation okay so maybe doesn't when it |
|
|
|
00:39:55.760 --> 00:39:58.400 |
|
have a negative qu |
|
|
|
00:40:02.960 --> 00:40:07.560 |
|
yeah yeah so for the conjugation if we |
|
|
|
00:40:05.520 --> 00:40:09.200 |
|
had the conjugations of the stems mapped |
|
|
|
00:40:07.560 --> 00:40:11.119 |
|
in the same position that might fix a |
|
|
|
00:40:09.200 --> 00:40:12.920 |
|
conjugation problem but I would say if |
|
|
|
00:40:11.119 --> 00:40:15.200 |
|
you don't do that then this kind of |
|
|
|
00:40:12.920 --> 00:40:18.160 |
|
fixes conjugation a little bit but maybe |
|
|
|
00:40:15.200 --> 00:40:21.319 |
|
not not really yeah kind of fix |
|
|
|
00:40:18.160 --> 00:40:24.079 |
|
conjugation because like they're using |
|
|
|
00:40:21.319 --> 00:40:26.760 |
|
the same there |
|
|
|
00:40:24.079 --> 00:40:28.400 |
|
probably different variations so we |
|
|
|
00:40:26.760 --> 00:40:31.359 |
|
learn how to |
|
|
|
00:40:28.400 --> 00:40:33.400 |
|
classify surrounding |
|
|
|
00:40:31.359 --> 00:40:35.000 |
|
structure yeah if it's a big enough |
|
|
|
00:40:33.400 --> 00:40:36.760 |
|
training set you might have covered the |
|
|
|
00:40:35.000 --> 00:40:37.880 |
|
various conjugations but if you haven't |
|
|
|
00:40:36.760 --> 00:40:43.000 |
|
and you don't have any rule-based |
|
|
|
00:40:37.880 --> 00:40:43.000 |
|
processing there might still be problems
|
|
|
00:40:45.400 --> 00:40:50.359 |
|
yeah yeah so infrequent words if you
|
|
|
00:40:48.280 --> 00:40:52.560 |
|
have a large enough training set yeah |
|
|
|
00:40:50.359 --> 00:40:54.599 |
|
you'll be able to fix it to some extent |
|
|
|
00:40:52.560 --> 00:40:56.480 |
|
so none of the problems are entirely |
|
|
|
00:40:54.599 --> 00:40:57.880 |
|
fixed but a lot of them are made better |
|
|
|
00:40:56.480 --> 00:40:58.960 |
|
different languages is also made better |
|
|
|
00:40:57.880 --> 00:41:00.119 |
|
if you have training data in that |
|
|
|
00:40:58.960 --> 00:41:04.599 |
|
language but if you don't then you're |
|
|
|
00:41:00.119 --> 00:41:06.240 |
|
out of luck so um so now what I'd like to
|
|
|
00:41:04.599 --> 00:41:10.800 |
|
do is I'd like to look at what
|
|
|
00:41:06.240 --> 00:41:15.079 |
|
our vectors represent so basically um in |
|
|
|
00:41:10.800 --> 00:41:16.880 |
|
uh in binary classification each word um |
|
|
|
00:41:15.079 --> 00:41:19.119 |
|
sorry so the vectors themselves |
|
|
|
00:41:16.880 --> 00:41:21.880 |
|
represent the counts of the words here |
|
|
|
00:41:19.119 --> 00:41:25.319 |
|
I'm talking about what the weight uh |
|
|
|
00:41:21.880 --> 00:41:28.520 |
|
vectors or matrices correspond to and |
|
|
|
00:41:25.319 --> 00:41:31.640 |
|
the weight uh Vector here will be |
|
|
|
00:41:28.520 --> 00:41:33.680 |
|
positive if the word tends to be
|
|
|
00:41:31.640 --> 00:41:36.680 |
|
positive in a binary classification
|
|
|
00:41:33.680 --> 00:41:38.400 |
|
case in a multiclass classification case |
|
|
|
00:41:36.680 --> 00:41:42.480 |
|
we'll actually have a matrix that looks |
|
|
|
00:41:38.400 --> 00:41:45.480 |
|
like this where um each column or row uh |
|
|
|
00:41:42.480 --> 00:41:47.079 |
|
corresponds to the word and each row or |
|
|
|
00:41:45.480 --> 00:41:49.319 |
|
column corresponds to a label and it |
|
|
|
00:41:47.079 --> 00:41:51.960 |
|
will be higher if that row tends to uh |
|
|
|
00:41:49.319 --> 00:41:54.800 |
|
correlate with that uh that word tends |
|
|
|
00:41:51.960 --> 00:41:56.920 |
|
to correlate with that label a little bit.
|
|
|
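In code, the multiclass case might look like this; the sizes and row/column orientation are illustrative assumptions:

```python
# Multiclass scoring: a weight matrix with one row per label and one
# column per vocabulary word; entry W[l, v] is large when word v tends
# to correlate with label l.
import numpy as np

num_labels, vocab_size = 3, 5      # e.g. positive / neutral / negative
W = np.zeros((num_labels, vocab_size))

counts = np.array([1.0, 1.0, 0.0, 1.0, 1.0])  # bag-of-words counts
label_scores = W @ counts                      # one score per label
predicted_label = int(np.argmax(label_scores))
```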
00:41:54.800 --> 00:41:59.240 |
|
so
|
|
|
00:41:56.920 --> 00:42:04.079 |
|
this um training of the bag of words |
|
|
|
00:41:59.240 --> 00:42:07.720 |
|
model can be done uh so simply that
|
|
|
00:42:04.079 --> 00:42:10.200 |
|
we uh can put it in a single slide so |
|
|
|
00:42:07.720 --> 00:42:11.599 |
|
basically here uh what we do is we start |
|
|
|
00:42:10.200 --> 00:42:14.760 |
|
out with the feature |
|
|
|
00:42:11.599 --> 00:42:18.880 |
|
weights and for each example in our data |
|
|
|
00:42:14.760 --> 00:42:20.800 |
|
set we extract features um the exact way |
|
|
|
00:42:18.880 --> 00:42:23.920 |
|
I'm extracting features is basically |
|
|
|
00:42:20.800 --> 00:42:25.720 |
|
splitting uh splitting the words using |
|
|
|
00:42:23.920 --> 00:42:28.000 |
|
the python split function and then uh |
|
|
|
00:42:25.720 --> 00:42:31.319 |
|
counting the number of times each word
|
|
|
00:42:28.000 --> 00:42:33.160 |
|
exists uh we then run the classifier so |
|
|
|
00:42:31.319 --> 00:42:36.280 |
|
actually running the classifier is |
|
|
|
00:42:33.160 --> 00:42:38.200 |
|
exactly the same as what we did for the |
|
|
|
00:42:36.280 --> 00:42:42.640 |
|
uh the rule based system it's just that |
|
|
|
00:42:38.200 --> 00:42:47.359 |
|
we have feature vectors instead and |
|
|
|
00:42:42.640 --> 00:42:51.559 |
|
then if the predicted value is |
|
|
|
00:42:47.359 --> 00:42:55.160 |
|
not the true value then for each of the
|
|
|
00:42:51.559 --> 00:42:56.680 |
|
features uh in the feature space we |
|
|
|
00:42:55.160 --> 00:43:02.200 |
|
upweight |
|
|
|
00:42:56.680 --> 00:43:03.599 |
|
the um we upweight the weight by the
|
|
|
00:43:02.200 --> 00:43:06.000 |
|
vector |
|
|
|
00:43:03.599 --> 00:43:09.920 |
|
size or by the amount of the vector
|
|
|
00:43:06.000 --> 00:43:13.240 |
|
if y is positive and we downweight the
|
|
|
00:43:09.920 --> 00:43:16.240 |
|
weight uh by the size of the vector if y
|
|
|
00:43:13.240 --> 00:43:18.520 |
|
is negative so this is really really |
|
|
|
00:43:16.240 --> 00:43:20.559 |
|
simple it's uh probably the simplest |
|
|
|
00:43:18.520 --> 00:43:25.079 |
|
possible algorithm for training one of these models.
|
|
|
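A sketch of that one-slide training loop, which is essentially the perceptron update; the function and variable names here are illustrative, not the slide's exact code:

```python
# Perceptron-style training for a bag-of-words classifier. On each
# mistake, weights move toward y: up if y is +1, down if y is -1.
import random
from collections import Counter

def extract_features(x: str) -> Counter:
    return Counter(x.split(" "))  # split on spaces, then count each word

def run_classifier(features: Counter, weights: dict) -> int:
    score = sum(weights.get(f, 0.0) * v for f, v in features.items())
    return 1 if score > 0 else -1

def train(data, epochs: int = 5) -> dict:
    weights = {}
    for _ in range(epochs):
        random.shuffle(data)            # see the note on shuffling below
        for x, y in data:               # y is +1 or -1
            features = extract_features(x)
            if run_classifier(features, weights) != y:
                for f, v in features.items():
                    weights[f] = weights.get(f, 0.0) + v * y
    return weights
```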
00:43:20.559 --> 00:43:27.559 |
|
um but I have an
|
|
|
00:43:25.079 --> 00:43:30.040 |
|
example in this that you can also take a |
|
|
|
00:43:27.559 --> 00:43:31.960 |
|
look at here's a trained bag of words |
|
|
|
00:43:30.040 --> 00:43:33.680 |
|
classifier and we could step through |
|
|
|
00:43:31.960 --> 00:43:34.960 |
|
this is on exactly the same data set as |
|
|
|
00:43:33.680 --> 00:43:37.240 |
|
I did before we're training on the |
|
|
|
00:43:34.960 --> 00:43:42.359 |
|
training set |
|
|
|
00:43:37.240 --> 00:43:43.640 |
|
um and uh evaluating on the dev set um I |
|
|
|
00:43:42.359 --> 00:43:45.880 |
|
also have some extra stuff like I'm |
|
|
|
00:43:43.640 --> 00:43:47.079 |
|
shuffling the order of the data IDs
|
|
|
00:43:45.880 --> 00:43:49.440 |
|
which is really important if you're |
|
|
|
00:43:47.079 --> 00:43:53.160 |
|
doing this sort of incremental algorithm |
|
|
|
00:43:49.440 --> 00:43:54.960 |
|
uh because uh what if what if your |
|
|
|
00:43:53.160 --> 00:43:57.400 |
|
training data set was ordered in this
|
|
|
00:43:54.960 --> 00:44:00.040 |
|
way where you have all of the positive |
|
|
|
00:43:57.400 --> 00:44:00.040 |
|
labels on |
|
|
|
00:44:00.359 --> 00:44:04.520 |
|
top and then you have all of the |
|
|
|
00:44:02.280 --> 00:44:06.680 |
|
negative labels on the |
|
|
|
00:44:04.520 --> 00:44:08.200 |
|
bottom if you do something like this it |
|
|
|
00:44:06.680 --> 00:44:10.200 |
|
would see only negative labels at the |
|
|
|
00:44:08.200 --> 00:44:11.800 |
|
end of training and you might have |
|
|
|
00:44:10.200 --> 00:44:14.400 |
|
problems because your model would only |
|
|
|
00:44:11.800 --> 00:44:17.440 |
|
predict negatives so we also shuffle
|
|
|
00:44:14.400 --> 00:44:20.319 |
|
data um and then step through we run the |
|
|
|
00:44:17.440 --> 00:44:22.559 |
|
classifier and I'm going to run uh five |
|
|
|
00:44:20.319 --> 00:44:23.640 |
|
epochs of training through the data set |
|
|
|
00:44:22.559 --> 00:44:27.160 |
|
uh very |
|
|
|
00:44:23.640 --> 00:44:29.599 |
|
fast and calculate our accuracy |
|
|
|
00:44:27.160 --> 00:44:33.280 |
|
and this got 75% accuracy on the |
|
|
|
00:44:29.599 --> 00:44:36.160 |
|
training data set and uh 56% accuracy on |
|
|
|
00:44:33.280 --> 00:44:40.000 |
|
the dev data set so uh if you remember
|
|
|
00:44:36.160 --> 00:44:41.520 |
|
our rule-based classifier had 42 uh 42 |
|
|
|
00:44:40.000 --> 00:44:43.880 |
|
accuracy and now our training based |
|
|
|
00:44:41.520 --> 00:44:45.760 |
|
classifier has 56 accuracy but it's |
|
|
|
00:44:43.880 --> 00:44:49.359 |
|
overfitting heavily to the training set
|
|
|
00:44:45.760 --> 00:44:50.880 |
|
so um basically this is a pretty strong |
|
|
|
00:44:49.359 --> 00:44:53.480 |
|
advertisement for why we should be using |
|
|
|
00:44:50.880 --> 00:44:54.960 |
|
machine learning you know the amount
|
|
|
00:44:53.480 --> 00:44:57.800 |
|
of code that we had for this machine |
|
|
|
00:44:54.960 --> 00:44:59.720 |
|
learning model is basically very similar |
|
|
|
00:44:57.800 --> 00:45:02.680 |
|
um it's not using any external libraries |
|
|
|
00:44:59.720 --> 00:45:02.680 |
|
but we're getting better at |
|
|
|
00:45:03.599 --> 00:45:08.800 |
|
this |
|
|
|
00:45:05.800 --> 00:45:08.800 |
|
cool |
|
|
|
00:45:09.559 --> 00:45:16.000 |
|
so cool any any questions |
|
|
|
00:45:13.520 --> 00:45:18.240 |
|
here and so I'm going to talk about the |
|
|
|
00:45:16.000 --> 00:45:20.760 |
|
connection between this algorithm and
|
|
|
00:45:18.240 --> 00:45:22.839 |
|
neural networks in the next class um |
|
|
|
00:45:20.760 --> 00:45:24.200 |
|
because this actually is using a very |
|
|
|
00:45:22.839 --> 00:45:26.319 |
|
similar training algorithm to what we |
|
|
|
00:45:24.200 --> 00:45:27.480 |
|
use in neural networks with some uh |
|
|
|
00:45:26.319 --> 00:45:30.079 |
|
particular |
|
|
|
00:45:27.480 --> 00:45:32.839 |
|
assumptions cool um so what's missing in |
|
|
|
00:45:30.079 --> 00:45:34.800 |
|
bag of words um still handling of |
|
|
|
00:45:32.839 --> 00:45:36.880 |
|
conjugation or compound words is not |
|
|
|
00:45:34.800 --> 00:45:39.160 |
|
perfect we can do it to some extent
|
|
|
00:45:36.880 --> 00:45:41.079 |
|
to the point where we can uh memorize |
|
|
|
00:45:39.160 --> 00:45:44.079 |
|
things so I love this movie I loved this
|
|
|
00:45:41.079 --> 00:45:46.920 |
|
movie another thing is handling word
|
|
|
00:45:44.079 --> 00:45:49.240 |
|
uh similarities so I love this movie and |
|
|
|
00:45:46.920 --> 00:45:50.720 |
|
I adore this movie uh these basically |
|
|
|
00:45:49.240 --> 00:45:52.119 |
|
mean the same thing as humans we know |
|
|
|
00:45:50.720 --> 00:45:54.200 |
|
they mean the same thing so we should be |
|
|
|
00:45:52.119 --> 00:45:56.079 |
|
able to take advantage of that fact to |
|
|
|
00:45:54.200 --> 00:45:57.839 |
|
learn better models but we're not doing |
|
|
|
00:45:56.079 --> 00:46:02.760 |
|
that in this model at the moment because |
|
|
|
00:45:57.839 --> 00:46:05.440 |
|
each unit is uh treated as an atomic unit
|
|
|
00:46:02.760 --> 00:46:08.040 |
|
and there's no idea of |
|
|
|
00:46:05.440 --> 00:46:11.040 |
|
similarity also handling of combination |
|
|
|
00:46:08.040 --> 00:46:12.760 |
|
features so um I love this movie and I |
|
|
|
00:46:11.040 --> 00:46:14.920 |
|
don't love this movie I hate this movie |
|
|
|
00:46:12.760 --> 00:46:17.079 |
|
and I don't hate this movie actually |
|
|
|
00:46:14.920 --> 00:46:20.400 |
|
this is a little bit tricky because |
|
|
|
00:46:17.079 --> 00:46:23.240 |
|
negative words are slightly indicative |
|
|
|
00:46:20.400 --> 00:46:25.280 |
|
of it being negative but actually what |
|
|
|
00:46:23.240 --> 00:46:28.119 |
|
they do is they negate the other things |
|
|
|
00:46:25.280 --> 00:46:28.119 |
|
that you're saying in the |
|
|
|
00:46:28.240 --> 00:46:36.559 |
|
sentence |
|
|
|
00:46:30.720 --> 00:46:40.480 |
|
so um like love is positive hate is |
|
|
|
00:46:36.559 --> 00:46:40.480 |
|
negative but like don't |
|
|
|
00:46:50.359 --> 00:46:56.079 |
|
love it's actually kind of like this |
|
|
|
00:46:52.839 --> 00:46:59.359 |
|
right like um love is very positive
|
|
|
00:46:56.079 --> 00:47:01.760 |
|
hate is very negative but don't love is |
|
|
|
00:46:59.359 --> 00:47:04.680 |
|
like slightly less positive than don't |
|
|
|
00:47:01.760 --> 00:47:06.160 |
|
hate right so um It's actually kind of |
|
|
|
00:47:04.680 --> 00:47:07.559 |
|
tricky because you need to combine them |
|
|
|
00:47:06.160 --> 00:47:10.720 |
|
together and figure out what's going on based on that.
|
|
|
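One hedged way to give a linear model a shot at such combinations is to add bigram features alongside the unigrams, so that "don't love" gets its own weight; the feature naming here is an illustrative assumption:

```python
# Unigram + bigram feature extraction, so word pairs like "don't love"
# become features the classifier can weight separately.
from collections import Counter

def unigram_bigram_features(x: str) -> Counter:
    words = x.lower().split(" ")
    features = Counter(words)            # unigram features
    for w1, w2 in zip(words, words[1:]):
        features[w1 + "_" + w2] += 1     # bigram features, e.g. "don't_love"
    return features
```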
00:47:07.559 --> 00:47:12.280 |
|
another example that a lot
|
|
|
00:47:10.720 --> 00:47:14.160 |
|
of people might not think of immediately |
|
|
|
00:47:12.280 --> 00:47:17.880 |
|
but is super super common in sentiment |
|
|
|
00:47:14.160 --> 00:47:20.160 |
|
analysis or any other thing is "but" so
|
|
|
00:47:17.880 --> 00:47:22.599 |
|
basically what but does is it throws |
|
|
|
00:47:20.160 --> 00:47:24.160 |
|
away all the stuff that you said before |
|
|
|
00:47:22.599 --> 00:47:26.119 |
|
um and you can just pay attention to the |
|
|
|
00:47:24.160 --> 00:47:29.000 |
|
stuff that comes afterward so like we
|
|
|
00:47:26.119 --> 00:47:30.440 |
|
could even add this to our um like if |
|
|
|
00:47:29.000 --> 00:47:31.760 |
|
you want to add this to your rule based |
|
|
|
00:47:30.440 --> 00:47:33.240 |
|
classifier you can do that you just |
|
|
|
00:47:31.760 --> 00:47:34.640 |
|
search for "but" and delete everything
|
|
|
00:47:33.240 --> 00:47:37.240 |
|
before it and see if that improves your
|
|
|
00:47:34.640 --> 00:47:39.240 |
|
accuracy might be a fun very
|
|
|
00:47:37.240 --> 00:47:43.480 |
|
quick thing to try.
|
|
|
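That quick heuristic might look like this; a sketch, not the notebook's actual code:

```python
# Keep only the text after the last "but", since it tends to override
# whatever came before it in the sentence.
def after_but(sentence: str) -> str:
    words = sentence.split(" ")
    if "but" not in words:
        return sentence
    last = len(words) - 1 - words[::-1].index("but")  # index of last "but"
    return " ".join(words[last + 1:])

print(after_but("the cast tries hard but the jokes never land"))
# -> "the jokes never land"
```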
00:47:39.240 --> 00:47:44.880 |
|
cool so the better solution which is
|
|
|
00:47:43.480 --> 00:47:46.800 |
|
what we're going to talk about for every |
|
|
|
00:47:44.880 --> 00:47:49.480 |
|
other class other than uh other than |
|
|
|
00:47:46.800 --> 00:47:52.160 |
|
this one is neural network models and |
|
|
|
00:47:49.480 --> 00:47:55.800 |
|
basically uh what they do is they do a |
|
|
|
00:47:52.160 --> 00:47:59.400 |
|
lookup of uh dense word embeddings so |
|
|
|
00:47:55.800 --> 00:48:02.520 |
|
instead of looking up uh individual uh |
|
|
|
00:47:59.400 --> 00:48:04.640 |
|
sparse uh vectors individual one hot |
|
|
|
00:48:02.520 --> 00:48:06.920 |
|
vectors they look up dense word |
|
|
|
00:48:04.640 --> 00:48:09.680 |
|
embeddings and then throw them into some |
|
|
|
00:48:06.920 --> 00:48:11.880 |
|
complicated function to extract features |
|
|
|
00:48:09.680 --> 00:48:16.359 |
|
and based on the features uh multiply by weights and get a score.
|
|
|
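A sketch of how this differs from bag of words; the embedding size, pooling function, and initialization are illustrative assumptions, and real models learn all of these:

```python
# Neural version: look up dense embeddings instead of one-hot vectors,
# combine them (a mean here, standing in for a much more complicated
# learned function), then score against learned weights.
import numpy as np

vocab = {"i": 0, "love": 1, "hate": 2, "this": 3, "movie": 4}
emb_dim = 8
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), emb_dim))  # learned in practice
weights = rng.normal(size=emb_dim)                   # learned in practice

def neural_score(sentence: str) -> float:
    ids = [vocab[w] for w in sentence.lower().split(" ") if w in vocab]
    if not ids:
        return 0.0
    features = embeddings[ids].mean(axis=0)  # the "complicated function"
    return float(weights @ features)
```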
00:48:11.880 --> 00:48:18.280 |
|
um and if you're
|
|
|
00:48:16.359 --> 00:48:20.359 |
|
doing text classification in the |
|
|
|
00:48:18.280 --> 00:48:22.520 |
|
traditional way this is normally what |
|
|
|
00:48:20.359 --> 00:48:23.760 |
|
you do um if you're doing text |
|
|
|
00:48:22.520 --> 00:48:25.960 |
|
classification with something like |
|
|
|
00:48:23.760 --> 00:48:27.280 |
|
prompting you're still actually doing |
|
|
|
00:48:25.960 --> 00:48:29.960 |
|
this because you're calculating the |
|
|
|
00:48:27.280 --> 00:48:32.960 |
|
score of the next word to predict and |
|
|
|
00:48:29.960 --> 00:48:34.720 |
|
that's done in exactly the same way so |
|
|
|
00:48:32.960 --> 00:48:37.760 |
|
uh even if you're using a large language |
|
|
|
00:48:34.720 --> 00:48:39.359 |
|
model like GPT this is still probably |
|
|
|
00:48:37.760 --> 00:48:41.800 |
|
happening under the hood unless OpenAI
|
|
|
00:48:39.359 --> 00:48:43.400 |
|
invented something very
|
|
|
00:48:41.800 --> 00:48:45.559 |
|
different and alien from anything else
|
|
|
00:48:43.400 --> 00:48:48.440 |
|
that we know of but I I'm guessing that |
|
|
|
00:48:45.559 --> 00:48:48.440 |
|
that probably hasn't
|
|
|
00:48:48.480 --> 00:48:52.880 |
|
happened um one nice thing about neural
|
|
|
00:48:50.880 --> 00:48:54.480 |
|
networks is neural networks |
|
|
|
00:48:52.880 --> 00:48:57.559 |
|
theoretically are powerful enough to |
|
|
|
00:48:54.480 --> 00:49:00.000 |
|
solve any task if you make them uh deep |
|
|
|
00:48:57.559 --> 00:49:01.160 |
|
enough or wide enough uh like if you |
|
|
|
00:49:00.000 --> 00:49:04.520 |
|
make them wide enough and then if you |
|
|
|
00:49:01.160 --> 00:49:06.799 |
|
make them deep it also helps further so |
|
|
|
00:49:04.520 --> 00:49:08.079 |
|
anytime somebody says well you can't |
|
|
|
00:49:06.799 --> 00:49:11.119 |
|
just solve that problem with neural |
|
|
|
00:49:08.079 --> 00:49:13.240 |
|
networks you know that they're lying |
|
|
|
00:49:11.119 --> 00:49:15.720 |
|
basically because they theoretically can |
|
|
|
00:49:13.240 --> 00:49:17.359 |
|
solve every problem uh but you have you |
|
|
|
00:49:15.720 --> 00:49:19.799 |
|
have issues of data you have issues of |
|
|
|
00:49:17.359 --> 00:49:23.079 |
|
other things like that so you know they |
|
|
|
00:49:19.799 --> 00:49:23.079 |
|
don't just necessarily work |
|
|
|
00:49:23.119 --> 00:49:28.040 |
|
out of the box. Cool um so the final thing I'd like
|
|
|
00:49:26.400 --> 00:49:29.319 |
|
to talk about is the road map going |
|
|
|
00:49:28.040 --> 00:49:31.319 |
|
forward some of the things I'm going to |
|
|
|
00:49:29.319 --> 00:49:32.799 |
|
cover in the class and some of the |
|
|
|
00:49:31.319 --> 00:49:35.200 |
|
logistics |
|
|
|
00:49:32.799 --> 00:49:36.799 |
|
issues so um the first thing I'm going |
|
|
|
00:49:35.200 --> 00:49:38.240 |
|
to talk about in the class is language |
|
|
|
00:49:36.799 --> 00:49:40.559 |
|
modeling
|
|
|
00:49:38.240 --> 00:49:42.720 |
|
fundamentals and uh so this could |
|
|
|
00:49:40.559 --> 00:49:44.240 |
|
include language models uh that just |
|
|
|
00:49:42.720 --> 00:49:46.559 |
|
predict the next words it could include |
|
|
|
00:49:44.240 --> 00:49:50.559 |
|
language models that predict the output |
|
|
|
00:49:46.559 --> 00:49:51.599 |
|
given the uh the input or the prompt um |
|
|
|
00:49:50.559 --> 00:49:54.559 |
|
I'm going to be talking about |
|
|
|
00:49:51.599 --> 00:49:56.520 |
|
representing words uh how how we get |
|
|
|
00:49:54.559 --> 00:49:59.319 |
|
word representation subword models other |
|
|
|
00:49:56.520 --> 00:50:01.440 |
|
things like that uh then go kind of |
|
|
|
00:49:59.319 --> 00:50:04.200 |
|
deeper into language modeling uh how do |
|
|
|
00:50:01.440 --> 00:50:07.799 |
|
we do it how do we evaluate it other |
|
|
|
00:50:04.200 --> 00:50:10.920 |
|
things um sequence encoding uh and this |
|
|
|
00:50:07.799 --> 00:50:13.240 |
|
is going to cover things like uh |
|
|
|
00:50:10.920 --> 00:50:16.280 |
|
Transformers uh self-attention models
|
|
|
00:50:13.240 --> 00:50:18.559 |
|
but also very quickly CNNs and RNNs
|
|
|
00:50:16.280 --> 00:50:20.880 |
|
which are useful in some |
|
|
|
00:50:18.559 --> 00:50:22.200 |
|
cases um and then we're going to |
|
|
|
00:50:20.880 --> 00:50:24.040 |
|
specifically go very deep into the |
|
|
|
00:50:22.200 --> 00:50:25.960 |
|
Transformer architecture and also talk a |
|
|
|
00:50:24.040 --> 00:50:27.280 |
|
little bit about some of the modern uh |
|
|
|
00:50:25.960 --> 00:50:30.240 |
|
improvements to the Transformer |
|
|
|
00:50:27.280 --> 00:50:31.839 |
|
architecture so the Transformer we're |
|
|
|
00:50:30.240 --> 00:50:33.839 |
|
using nowadays is very different than |
|
|
|
00:50:31.839 --> 00:50:36.200 |
|
the Transformer that was invented in |
|
|
|
00:50:33.839 --> 00:50:37.240 |
|
2017 uh so we're going to talk well I |
|
|
|
00:50:36.200 --> 00:50:38.760 |
|
wouldn't say very different but |
|
|
|
00:50:37.240 --> 00:50:41.359 |
|
different enough that it's important so |
|
|
|
00:50:38.760 --> 00:50:43.280 |
|
we're going to talk about some of those |
|
|
|
00:50:41.359 --> 00:50:45.079 |
|
things second thing I'd like to talk |
|
|
|
00:50:43.280 --> 00:50:47.000 |
|
about is training and inference methods |
|
|
|
00:50:45.079 --> 00:50:48.839 |
|
so this includes uh generation |
|
|
|
00:50:47.000 --> 00:50:52.119 |
|
algorithms uh so we're going to have a |
|
|
|
00:50:48.839 --> 00:50:55.520 |
|
whole class on how we generate text uh |
|
|
|
00:50:52.119 --> 00:50:58.319 |
|
in different ways uh prompting how uh we |
|
|
|
00:50:55.520 --> 00:50:59.720 |
|
can prompt things I hear uh world class |
|
|
|
00:50:58.319 --> 00:51:01.799 |
|
prompt engineers make a lot of money |
|
|
|
00:50:59.720 --> 00:51:05.480 |
|
nowadays so uh you'll want to pay |
|
|
|
00:51:01.799 --> 00:51:08.760 |
|
attention to that one um and instruction |
|
|
|
00:51:05.480 --> 00:51:11.520 |
|
tuning uh so how do we train models to |
|
|
|
00:51:08.760 --> 00:51:13.720 |
|
handle a lot of different tasks and |
|
|
|
00:51:11.520 --> 00:51:15.839 |
|
reinforcement learning so how do we uh |
|
|
|
00:51:13.720 --> 00:51:18.520 |
|
you know like actually generate outputs |
|
|
|
00:51:15.839 --> 00:51:19.839 |
|
uh kind of Judge them and then learn |
|
|
|
00:51:18.520 --> 00:51:22.599 |
|
from |
|
|
|
00:51:19.839 --> 00:51:25.880 |
|
there also experimental design and |
|
|
|
00:51:22.599 --> 00:51:28.079 |
|
evaluation so experimental design uh so |
|
|
|
00:51:25.880 --> 00:51:30.480 |
|
how do we design an experiment well uh |
|
|
|
00:51:28.079 --> 00:51:32.000 |
|
so that it backs up what we want to be |
|
|
|
00:51:30.480 --> 00:51:34.559 |
|
uh our conclusions that we want to be |
|
|
|
00:51:32.000 --> 00:51:37.000 |
|
backing up how do we do human annotation |
|
|
|
00:51:34.559 --> 00:51:38.880 |
|
of data in a reliable way this is |
|
|
|
00:51:37.000 --> 00:51:41.160 |
|
getting harder and harder as models get |
|
|
|
00:51:38.880 --> 00:51:43.359 |
|
better and better because uh getting |
|
|
|
00:51:41.160 --> 00:51:45.000 |
|
humans who don't care very much about |
|
|
|
00:51:43.359 --> 00:51:48.559 |
|
The annotation task they might do worse |
|
|
|
00:51:45.000 --> 00:51:51.119 |
|
than GPT-4 so um you need to be careful of
|
|
|
00:51:48.559 --> 00:51:52.240 |
|
that also debugging and interpretation |
|
|
|
00:51:51.119 --> 00:51:53.960 |
|
technique so what are some of the |
|
|
|
00:51:52.240 --> 00:51:55.160 |
|
automatic techniques that you can do to |
|
|
|
00:51:53.960 --> 00:51:57.720 |
|
quickly figure out what's going wrong |
|
|
|
00:51:55.160 --> 00:52:00.040 |
|
with your models and improve |
|
|
|
00:51:57.720 --> 00:52:01.599 |
|
them and uh bias and fairness |
|
|
|
00:52:00.040 --> 00:52:04.200 |
|
considerations so it's really really |
|
|
|
00:52:01.599 --> 00:52:05.799 |
|
important nowadays uh that models are |
|
|
|
00:52:04.200 --> 00:52:07.880 |
|
being deployed to real people in the |
|
|
|
00:52:05.799 --> 00:52:09.880 |
|
real world and like actually causing |
|
|
|
00:52:07.880 --> 00:52:11.760 |
|
harm to people in some cases that we |
|
|
|
00:52:09.880 --> 00:52:15.160 |
|
need to be worried about |
|
|
|
00:52:11.760 --> 00:52:17.000 |
|
that Advanced Training in architectures |
|
|
|
00:52:15.160 --> 00:52:19.280 |
|
so we're going to talk about
|
|
|
00:52:17.000 --> 00:52:21.400 |
|
distillation and quantization how can we |
|
|
|
00:52:19.280 --> 00:52:23.520 |
|
make small language models uh that |
|
|
|
00:52:21.400 --> 00:52:24.880 |
|
actually still work well like not large |
|
|
|
00:52:23.520 --> 00:52:27.559 |
|
you can run them on your phone you can |
|
|
|
00:52:24.880 --> 00:52:29.920 |
|
run them on your local |
|
|
|
00:52:27.559 --> 00:52:31.640 |
|
laptop um ensembling and mixtures of |
|
|
|
00:52:29.920 --> 00:52:33.480 |
|
experts how can we combine together |
|
|
|
00:52:31.640 --> 00:52:34.760 |
|
multiple models in order to create |
|
|
|
00:52:33.480 --> 00:52:35.880 |
|
models that are better than the sum of |
|
|
|
00:52:34.760 --> 00:52:38.799 |
|
their |
|
|
|
00:52:35.880 --> 00:52:40.720 |
|
parts and um retrieval and retrieval |
|
|
|
00:52:38.799 --> 00:52:43.920 |
|
augmented |
|
|
|
00:52:40.720 --> 00:52:45.480 |
|
generation long sequence models uh so |
|
|
|
00:52:43.920 --> 00:52:49.920 |
|
how do we handle long |
|
|
|
00:52:45.480 --> 00:52:52.240 |
|
outputs um and uh we're going to talk |
|
|
|
00:52:49.920 --> 00:52:55.760 |
|
about applications to complex reasoning |
|
|
|
00:52:52.240 --> 00:52:57.760 |
|
tasks code generation language agents |
|
|
|
00:52:55.760 --> 00:52:59.920 |
|
and knowledge-based QA and information |
|
|
|
00:52:57.760 --> 00:53:04.160 |
|
extraction I picked |
|
|
|
00:52:59.920 --> 00:53:06.760 |
|
these because they seem to be maybe the |
|
|
|
00:53:04.160 --> 00:53:09.880 |
|
most important at least in research |
|
|
|
00:53:06.760 --> 00:53:11.440 |
|
nowadays and also they cover uh the |
|
|
|
00:53:09.880 --> 00:53:13.640 |
|
things that when I talk to people in |
|
|
|
00:53:11.440 --> 00:53:15.280 |
|
Industry are kind of most interested in |
|
|
|
00:53:13.640 --> 00:53:17.559 |
|
so hopefully it'll be useful regardless |
|
|
|
00:53:15.280 --> 00:53:19.799 |
|
of uh whether you plan on doing research |
|
|
|
00:53:17.559 --> 00:53:22.839 |
|
or or plan on doing industry related |
|
|
|
00:53:19.799 --> 00:53:24.160 |
|
things uh by by the way the two things |
|
|
|
00:53:22.839 --> 00:53:25.920 |
|
that when I talk to people in Industry |
|
|
|
00:53:24.160 --> 00:53:29.599 |
|
they're most interested in are Rag and |
|
|
|
00:53:25.920 --> 00:53:31.079 |
|
code generation at the moment for now um |
|
|
|
00:53:29.599 --> 00:53:32.319 |
|
so those are ones that you'll want to |
|
|
|
00:53:31.079 --> 00:53:34.680 |
|
pay attention |
|
|
|
00:53:32.319 --> 00:53:36.599 |
|
to and then finally we have a few |
|
|
|
00:53:34.680 --> 00:53:40.079 |
|
lectures on Linguistics and |
|
|
|
00:53:36.599 --> 00:53:42.720 |
|
multilinguality um I love Linguistics |
|
|
|
00:53:40.079 --> 00:53:44.839 |
|
but uh to be honest at the moment most |
|
|
|
00:53:42.720 --> 00:53:47.760 |
|
of our Cutting Edge models don't |
|
|
|
00:53:44.839 --> 00:53:49.240 |
|
explicitly use linguistic structure um |
|
|
|
00:53:47.760 --> 00:53:50.799 |
|
but I still think it's useful to know |
|
|
|
00:53:49.240 --> 00:53:52.760 |
|
about it especially if you're working on |
|
|
|
00:53:50.799 --> 00:53:54.880 |
|
multilingual things especially if you're |
|
|
|
00:53:52.760 --> 00:53:57.040 |
|
interested in very robust generalization |
|
|
|
00:53:54.880 --> 00:53:58.920 |
|
to new models so we're going to talk a |
|
|
|
00:53:57.040 --> 00:54:02.599 |
|
little bit about that and also |
|
|
|
00:53:58.920 --> 00:54:06.079 |
|
multilingual NLP I'm going to have
|
|
|
00:54:02.599 --> 00:54:09.119 |
|
a guest lecture so also if you have any suggestions
|
|
|
00:54:06.079 --> 00:54:11.400 |
|
um we have two guest lecture slots still |
|
|
|
00:54:09.119 --> 00:54:12.799 |
|
open uh that I'm trying to fill so if |
|
|
|
00:54:11.400 --> 00:54:15.440 |
|
you have any things that you really want |
|
|
|
00:54:12.799 --> 00:54:16.440 |
|
to hear about um I could either add them |
|
|
|
00:54:15.440 --> 00:54:19.319 |
|
to the |
|
|
|
00:54:16.440 --> 00:54:21.079 |
|
existing you know content or I could |
|
|
|
00:54:19.319 --> 00:54:23.240 |
|
invite a guest lecturer who's working on |
|
|
|
00:54:21.079 --> 00:54:24.079 |
|
that topic so you know please feel free |
|
|
|
00:54:23.240 --> 00:54:26.760 |
|
to tell |
|
|
|
00:54:24.079 --> 00:54:29.160 |
|
me um then the class format and |
|
|
|
00:54:26.760 --> 00:54:32.280 |
|
structure uh the class |
|
|
|
00:54:29.160 --> 00:54:34.000 |
|
content my goal is to learn in detail |
|
|
|
00:54:32.280 --> 00:54:36.640 |
|
about building NLP systems from a |
|
|
|
00:54:34.000 --> 00:54:40.520 |
|
research perspective so this is a 700 |
|
|
|
00:54:36.640 --> 00:54:43.599 |
|
level course so it's aiming to be for |
|
|
|
00:54:40.520 --> 00:54:46.960 |
|
people who really want to try new and |
|
|
|
00:54:43.599 --> 00:54:49.280 |
|
Innovative things in uh kind of natural |
|
|
|
00:54:46.960 --> 00:54:51.359 |
|
language processing it's not going to |
|
|
|
00:54:49.280 --> 00:54:52.760 |
|
focus solely on reimplementing things |
|
|
|
00:54:51.359 --> 00:54:54.319 |
|
that have been done before including in |
|
|
|
00:54:52.760 --> 00:54:55.280 |
|
the project I'm going to be expecting |
|
|
|
00:54:54.319 --> 00:54:58.480 |
|
everybody to do something
|
|
|
00:54:55.280 --> 00:54:59.920 |
|
that's kind of new whether it's coming |
|
|
|
00:54:58.480 --> 00:55:01.359 |
|
up with a new method or applying |
|
|
|
00:54:59.920 --> 00:55:03.559 |
|
existing methods to a place where they |
|
|
|
00:55:01.359 --> 00:55:05.079 |
|
haven't been used before or building out |
|
|
|
00:55:03.559 --> 00:55:06.640 |
|
things for a new language or something |
|
|
|
00:55:05.079 --> 00:55:08.359 |
|
like that so that's kind of one of the |
|
|
|
00:55:06.640 --> 00:55:11.480 |
|
major goals of this |
|
|
|
00:55:08.359 --> 00:55:13.000 |
|
class um learn basic and advanced topics |
|
|
|
00:55:11.480 --> 00:55:15.559 |
|
in machine learning approaches to NLP |
|
|
|
00:55:13.000 --> 00:55:18.359 |
|
and language models learn some basic |
|
|
|
00:55:15.559 --> 00:55:21.480 |
|
linguistic knowledge useful in NLP uh |
|
|
|
00:55:18.359 --> 00:55:23.200 |
|
see case studies of NLP applications and |
|
|
|
00:55:21.480 --> 00:55:25.680 |
|
learn how to identify unique problems |
|
|
|
00:55:23.200 --> 00:55:29.039 |
|
for each um one thing I'd like to point
|
|
|
00:55:25.680 --> 00:55:31.160 |
|
out is I'm not going to cover every NLP |
|
|
|
00:55:29.039 --> 00:55:32.920 |
|
application ever because that would be |
|
|
|
00:55:31.160 --> 00:55:35.520 |
|
absolutely impossible NLP is being used |
|
|
|
00:55:32.920 --> 00:55:37.079 |
|
in so many different areas nowadays but |
|
|
|
00:55:35.520 --> 00:55:38.960 |
|
what I want people to pay attention to |
|
|
|
00:55:37.079 --> 00:55:41.280 |
|
like even if you're not super interested |
|
|
|
00:55:38.960 --> 00:55:42.400 |
|
in code generation for example what you |
|
|
|
00:55:41.280 --> 00:55:44.200 |
|
can do is you can look at code |
|
|
|
00:55:42.400 --> 00:55:46.160 |
|
generation look at how people identify |
|
|
|
00:55:44.200 --> 00:55:47.680 |
|
problems look at the methods that people |
|
|
|
00:55:46.160 --> 00:55:50.880 |
|
have proposed to solve those unique |
|
|
|
00:55:47.680 --> 00:55:53.039 |
|
problems and then kind of map that try |
|
|
|
00:55:50.880 --> 00:55:54.799 |
|
to do some generalization onto your own |
|
|
|
00:55:53.039 --> 00:55:57.799 |
|
problems of Interest so uh that's kind |
|
|
|
00:55:54.799 --> 00:56:00.280 |
|
of the goal of the NLP |
|
|
|
00:55:57.799 --> 00:56:02.440 |
|
applications finally uh learning how to |
|
|
|
00:56:00.280 --> 00:56:05.160 |
|
debug when and where NLP systems fail |
|
|
|
00:56:02.440 --> 00:56:08.200 |
|
and build improvements based on this so |
|
|
|
00:56:05.160 --> 00:56:10.200 |
|
um ever since I was a graduate student |
|
|
|
00:56:08.200 --> 00:56:12.720 |
|
this has been like one of the really |
|
|
|
00:56:10.200 --> 00:56:15.920 |
|
important things that I feel like I've |
|
|
|
00:56:12.720 --> 00:56:17.440 |
|
done well or done better than some other |
|
|
|
00:56:15.920 --> 00:56:19.280 |
|
people and I I feel like it's a really |
|
|
|
00:56:17.440 --> 00:56:21.119 |
|
good way to like even if you're only |
|
|
|
00:56:19.280 --> 00:56:22.680 |
|
interested in improving accuracy knowing |
|
|
|
00:56:21.119 --> 00:56:25.039 |
|
why your system's failing still is the |
|
|
|
00:56:22.680 --> 00:56:27.599 |
|
best way to do that I so I'm going to |
|
|
|
00:56:25.039 --> 00:56:30.559 |
|
put a lot of emphasis on |
|
|
|
00:56:27.599 --> 00:56:32.559 |
|
that in terms of the class format um |
|
|
|
00:56:30.559 --> 00:56:36.280 |
|
before class for some classes there are |
|
|
|
00:56:32.559 --> 00:56:37.880 |
|
recommended reading uh this can be |
|
|
|
00:56:36.280 --> 00:56:39.559 |
|
helpful to read I'm never going to |
|
|
|
00:56:37.880 --> 00:56:41.119 |
|
expect you to definitely have read it |
|
|
|
00:56:39.559 --> 00:56:42.480 |
|
before the class but I would suggest |
|
|
|
00:56:41.119 --> 00:56:45.160 |
|
that maybe you'll get more out of the |
|
|
|
00:56:42.480 --> 00:56:47.319 |
|
class if you do that um during class |
|
|
|
00:56:45.160 --> 00:56:48.079 |
|
we'll have the lecture um in discussion |
|
|
|
00:56:47.319 --> 00:56:50.559 |
|
with |
|
|
|
00:56:48.079 --> 00:56:52.359 |
|
everybody um sometimes we'll have a code |
|
|
|
00:56:50.559 --> 00:56:55.839 |
|
or data walk |
|
|
|
00:56:52.359 --> 00:56:58.760 |
|
um actually this is a little bit old I
|
|
|
00:56:55.839 --> 00:57:01.880 |
|
I have this slide but this year we're
|
|
|
00:56:58.760 --> 00:57:04.160 |
|
going to be adding more uh code and data |
|
|
|
00:57:01.880 --> 00:57:07.400 |
|
walks during office hours and the way it |
|
|
|
00:57:04.160 --> 00:57:09.400 |
|
will work is one of the TAs we have
|
|
|
00:57:07.400 --> 00:57:11.160 |
|
seven TAs who I'm going to introduce
|
|
|
00:57:09.400 --> 00:57:15.000 |
|
very soon but one of the TAs will be
|
|
|
00:57:11.160 --> 00:57:16.839 |
|
doing this kind of recitation where you |
|
|
|
00:57:15.000 --> 00:57:18.200 |
|
um where we go over a library so if |
|
|
|
00:57:16.839 --> 00:57:19.480 |
|
you're not familiar with the library and |
|
|
|
00:57:18.200 --> 00:57:21.960 |
|
you want to be more familiar with the |
|
|
|
00:57:19.480 --> 00:57:23.720 |
|
library you can join this and uh then |
|
|
|
00:57:21.960 --> 00:57:25.400 |
|
we'll be able to do this and this will |
|
|
|
00:57:23.720 --> 00:57:28.240 |
|
cover things like |
|
|
|
00:57:25.400 --> 00:57:31.039 |
|
um PyTorch and SentencePiece uh we're
|
|
|
00:57:28.240 --> 00:57:33.280 |
|
going to start out with hugging face um |
|
|
|
00:57:31.039 --> 00:57:36.559 |
|
inference stuff like |
|
|
|
00:57:33.280 --> 00:57:41.520 |
|
vLLM uh debugging software like
|
|
|
00:57:36.559 --> 00:57:41.520 |
|
Zeno um what were the other
|
|
|
00:57:41.960 --> 00:57:47.200 |
|
ones oh the OpenAI API and LiteLLM
|
|
|
00:57:45.680 --> 00:57:50.520 |
|
other stuff like that so we we have lots |
|
|
|
00:57:47.200 --> 00:57:53.599 |
|
of them planned we'll uh uh we'll update |
|
|
|
00:57:50.520 --> 00:57:54.839 |
|
that um and then after class after |
|
|
|
00:57:53.599 --> 00:57:58.079 |
|
almost every class we'll have a question |
|
|
|
00:57:54.839 --> 00:58:00.079 |
|
quiz um and the quiz is intended to just |
|
|
|
00:57:58.079 --> 00:58:02.000 |
|
you know make sure that you uh paid |
|
|
|
00:58:00.079 --> 00:58:04.480 |
|
attention to the material and are able |
|
|
|
00:58:02.000 --> 00:58:07.520 |
|
to answer questions about it we will aim |
|
|
|
00:58:04.480 --> 00:58:09.559 |
|
to release it on the day of the course |
|
|
|
00:58:07.520 --> 00:58:11.599 |
|
the day of the actual lecture and it |
|
|
|
00:58:09.559 --> 00:58:14.559 |
|
will be due at the end of the following |
|
|
|
00:58:11.599 --> 00:58:15.960 |
|
day of the lecture so um it will be |
|
|
|
00:58:14.559 --> 00:58:18.920 |
|
three questions it probably shouldn't |
|
|
|
00:58:15.960 --> 00:58:20.680 |
|
take a whole lot of time but um uh yeah |
|
|
|
00:58:18.920 --> 00:58:23.400 |
|
so we'll have
|
|
|
00:58:20.680 --> 00:58:26.319 |
|
that in terms of assignments assignment |
|
|
|
00:58:23.400 --> 00:58:28.640 |
|
one is going to be build your own Llama
|
|
|
00:58:26.319 --> 00:58:30.200 |
|
and so what this is going to look like |
|
|
|
00:58:28.640 --> 00:58:32.680 |
|
is we're going to give you a partial |
|
|
|
00:58:30.200 --> 00:58:34.319 |
|
implementation of Llama which is kind of
|
|
|
00:58:32.680 --> 00:58:37.960 |
|
the most popular open source language |
|
|
|
00:58:34.319 --> 00:58:40.160 |
|
model nowadays and ask you to fill in um |
|
|
|
00:58:37.960 --> 00:58:42.839 |
|
ask you to fill in the parts we're going |
|
|
|
00:58:40.160 --> 00:58:45.920 |
|
to train a very small version of Llama
|
|
|
00:58:42.839 --> 00:58:47.319 |
|
on a small data set and get it to work |
|
|
|
00:58:45.920 --> 00:58:48.880 |
|
and the reason why it's very small is |
|
|
|
00:58:47.319 --> 00:58:50.480 |
|
because the smallest actual version of |
|
|
|
00:58:48.880 --> 00:58:53.039 |
|
Llama is 7 billion
|
|
|
00:58:50.480 --> 00:58:55.359 |
|
parameters um and that might be a little |
|
|
|
00:58:53.039 --> 00:58:58.400 |
|
bit difficult to train with |
|
|
|
00:58:55.359 --> 00:59:00.680 |
|
resources um for assignment two we're |
|
|
|
00:58:58.400 --> 00:59:04.559 |
|
going to try to do an NLP task from |
|
|
|
00:59:00.680 --> 00:59:06.920 |
|
scratch and so the way this will work is |
|
|
|
00:59:04.559 --> 00:59:08.520 |
|
we're going to give you an assignment |
|
|
|
00:59:06.920 --> 00:59:10.880 |
|
which we're not going to give you an |
|
|
|
00:59:08.520 --> 00:59:13.400 |
|
actual data set and instead we're going |
|
|
|
00:59:10.880 --> 00:59:15.760 |
|
to ask you to uh perform data creation |
|
|
|
00:59:13.400 --> 00:59:19.359 |
|
modeling and evaluation for a specified |
|
|
|
00:59:15.760 --> 00:59:20.640 |
|
task and so we're going to tell you uh |
|
|
|
00:59:19.359 --> 00:59:22.599 |
|
what to do but we're not going to tell |
|
|
|
00:59:20.640 --> 00:59:26.400 |
|
you exactly how to do it but we're going |
|
|
|
00:59:22.599 --> 00:59:29.680 |
|
to try to give as concrete directions as
|
|
|
00:59:26.400 --> 00:59:32.359 |
|
we can um |
|
|
|
00:59:29.680 --> 00:59:34.160 |
|
yeah will you be given a parameter limit |
|
|
|
00:59:32.359 --> 00:59:36.559 |
|
on the model so that's a good question |
|
|
|
00:59:34.160 --> 00:59:39.119 |
|
or like a expense limit or something |
|
|
|
00:59:36.559 --> 00:59:40.440 |
|
like that um I maybe actually I should |
|
|
|
00:59:39.119 --> 00:59:44.240 |
|
take a break from the assignments and |
|
|
|
00:59:40.440 --> 00:59:46.520 |
|
talk about compute so right now um for |
|
|
|
00:59:44.240 --> 00:59:49.319 |
|
assignment one we're planning on having |
|
|
|
00:59:46.520 --> 00:59:51.599 |
|
this be able to be done either on a Mac |
|
|
|
00:59:49.319 --> 00:59:53.520 |
|
laptop with an M1 or M2 processor which |
|
|
|
00:59:51.599 --> 00:59:57.079 |
|
I think a lot of people have or Google |
|
|
|
00:59:53.520 --> 00:59:59.839 |
|
Colab um so it should be like
|
|
|
00:59:57.079 --> 01:00:02.160 |
|
sufficient to use free computational |
|
|
|
00:59:59.839 --> 01:00:03.640 |
|
resources that you have for number two |
|
|
|
01:00:02.160 --> 01:00:06.079 |
|
we'll think about that I think that's |
|
|
|
01:00:03.640 --> 01:00:08.280 |
|
important we do have Google cloud |
|
|
|
01:00:06.079 --> 01:00:11.520 |
|
credits for $50 for everybody and I'm |
|
|
|
01:00:08.280 --> 01:00:13.440 |
|
working to get AWS credits for more um |
|
|
|
01:00:11.520 --> 01:00:18.160 |
|
but the cloud providers nowadays are |
|
|
|
01:00:13.440 --> 01:00:19.680 |
|
being very stingy so um so it's uh been |
|
|
|
01:00:18.160 --> 01:00:22.160 |
|
a little bit of a fight to get uh |
|
|
|
01:00:19.680 --> 01:00:23.680 |
|
credits but I I it is very important so |
|
|
|
01:00:22.160 --> 01:00:28.480 |
|
I'm going to try to get as as many as we |
|
|
|
01:00:23.680 --> 01:00:31.119 |
|
can um and so yeah I I think basically |
|
|
|
01:00:28.480 --> 01:00:32.280 |
|
uh there will be some sort of like limit |
|
|
|
01:00:31.119 --> 01:00:34.480 |
|
on the amount of things you can |
|
|
|
01:00:32.280 --> 01:00:36.240 |
|
practically do and so because of that |
|
|
|
01:00:34.480 --> 01:00:39.920 |
|
I'm hoping that people will rely very |
|
|
|
01:00:36.240 --> 01:00:43.359 |
|
heavily on pre-trained models um or uh |
|
|
|
01:00:39.920 --> 01:00:46.079 |
|
yeah pre-trained models |
|
|
|
01:00:43.359 --> 01:00:49.599 |
|
and yeah so that's the short
|
|
|
01:00:46.079 --> 01:00:52.799 |
|
story um the second thing uh the
|
|
|
01:00:49.599 --> 01:00:54.720 |
|
assignment three is to do a survey of |
|
|
|
01:00:52.799 --> 01:00:57.920 |
|
some sort of state-of-the-art research
|
|
|
01:00:54.720 --> 01:01:00.760 |
|
and do a reimplementation of
|
|
|
01:00:57.920 --> 01:01:02.000 |
|
this and in doing this again you will |
|
|
|
01:01:00.760 --> 01:01:03.440 |
|
have to think about something that's |
|
|
|
01:01:02.000 --> 01:01:06.359 |
|
feasible within computational |
|
|
|
01:01:03.440 --> 01:01:08.680 |
|
constraints um and so you can discuss |
|
|
|
01:01:06.359 --> 01:01:11.839 |
|
with your TAs about uh about the best
|
|
|
01:01:08.680 --> 01:01:13.920 |
|
way to do this um and then the final |
|
|
|
01:01:11.839 --> 01:01:15.400 |
|
project is to perform a unique project |
|
|
|
01:01:13.920 --> 01:01:17.559 |
|
that either improves on the state-of-the |
|
|
|
01:01:15.400 --> 01:01:21.000 |
|
art with respect to whatever you would |
|
|
|
01:01:17.559 --> 01:01:23.440 |
|
like to improve with this could be uh |
|
|
|
01:01:21.000 --> 01:01:25.280 |
|
accuracy for sure this could be |
|
|
|
01:01:23.440 --> 01:01:27.760 |
|
efficiency |
|
|
|
01:01:25.280 --> 01:01:29.599 |
|
it could be some sense of |
|
|
|
01:01:27.760 --> 01:01:31.520 |
|
interpretability but if it's going to be |
|
|
|
01:01:29.599 --> 01:01:33.599 |
|
something like interpretability you'll |
|
|
|
01:01:31.520 --> 01:01:35.440 |
|
have to discuss with us what that means |
|
|
|
01:01:33.599 --> 01:01:37.240 |
|
like how we measure that how we can like |
|
|
|
01:01:35.440 --> 01:01:40.839 |
|
actually say that you did a good job |
|
|
|
01:01:37.240 --> 01:01:42.839 |
|
with improving that um another thing |
|
|
|
01:01:40.839 --> 01:01:44.680 |
|
that you can do is take whatever you |
|
|
|
01:01:42.839 --> 01:01:47.280 |
|
implemented for assignment 3 and apply |
|
|
|
01:01:44.680 --> 01:01:49.039 |
|
it to a new task or apply it to a new |
|
|
|
01:01:47.280 --> 01:01:50.760 |
|
language that has never been examined |
|
|
|
01:01:49.039 --> 01:01:53.119 |
|
before so these are also acceptable |
|
|
|
01:01:50.760 --> 01:01:54.240 |
|
final projects but basically the idea is |
|
|
|
01:01:53.119 --> 01:01:55.559 |
|
for the final project you need to do |
|
|
|
01:01:54.240 --> 01:01:57.480 |
|
something new that hasn't been
|
|
|
01:01:55.559 --> 01:01:59.880 |
|
done before and create new knowledge |
|
|
|
01:01:57.480 --> 01:02:04.520 |
|
with respect
|
|
|
01:01:59.880 --> 01:02:07.640 |
|
to it um so for this the instructor is me
|
|
|
01:02:04.520 --> 01:02:09.920 |
|
um I'm uh looking forward to you know |
|
|
|
01:02:07.640 --> 01:02:13.599 |
|
discussing and working with all of you |
|
|
|
01:02:09.920 --> 01:02:16.119 |
|
um for TAs we have seven TAs uh two of
|
|
|
01:02:13.599 --> 01:02:18.319 |
|
them are in transit so they're not here |
|
|
|
01:02:16.119 --> 01:02:22.279 |
|
today um the other ones uh TAs would you
|
|
|
01:02:18.319 --> 01:02:22.279 |
|
mind coming up uh to introduce |
|
|
|
01:02:23.359 --> 01:02:26.359 |
|
yourself |
|
|
|
01:02:28.400 --> 01:02:32.839 |
|
so um yeah nhir and akshai couldn't be |
|
|
|
01:02:31.599 --> 01:02:34.039 |
|
here today because they're traveling |
|
|
|
01:02:32.839 --> 01:02:37.119 |
|
I'll introduce them later because |
|
|
|
01:02:34.039 --> 01:02:37.119 |
|
they're coming uh next |
|
|
|
01:02:40.359 --> 01:02:46.480 |
|
time cool and what I'd like everybody to |
|
|
|
01:02:43.000 --> 01:02:48.680 |
|
do is say um like you know what your |
|
|
|
01:02:46.480 --> 01:02:53.079 |
|
name is uh what |
|
|
|
01:02:48.680 --> 01:02:55.799 |
|
your like maybe what you're interested |
|
|
|
01:02:53.079 --> 01:02:57.319 |
|
in um and the reason the goal of this is |
|
|
|
01:02:55.799 --> 01:02:59.200 |
|
number one for everybody to know who you |
|
|
|
01:02:57.319 --> 01:03:00.720 |
|
are and number two for everybody to know |
|
|
|
01:02:59.200 --> 01:03:03.440 |
|
who the best person to talk to is if |
|
|
|
01:03:00.720 --> 01:03:03.440 |
|
they're interested in |
|
|
|
01:03:04.200 --> 01:03:09.079 |
|
particular hi uh I'm |
|
|
|
01:03:07.000 --> 01:03:15.520 |
|
Aila second |
|
|
|
01:03:09.079 --> 01:03:15.520 |
|
year I work on language and social |
|
|
|
01:03:16.200 --> 01:03:24.559 |
|
and I'm a second year PhD
|
|
|
01:03:21.160 --> 01:03:26.799 |
|
student uh my research
|
|
|
01:03:24.559 --> 01:03:28.480 |
|
is at the border of NLP and
|
|
|
01:03:26.799 --> 01:03:31.000 |
|
human-computer interaction with a lot of work
|
|
|
01:03:28.480 --> 01:03:32.640 |
|
on automating parts of the developer |
|
|
|
01:03:31.000 --> 01:03:35.319 |
|
experience to make it easier for anyone |
|
|
|
01:03:32.640 --> 01:03:35.319 |
|
to |
|
|
|
01:03:39.090 --> 01:03:42.179 |
|
[Music] |
|
|
|
01:03:47.520 --> 01:03:53.279 |
|
orif |
|
|
|
01:03:50.079 --> 01:03:54.680 |
|
everyone first |
|
|
|
01:03:53.279 --> 01:03:57.119 |
|
year |
|
|
|
01:03:54.680 --> 01:04:00.119 |
|
[Music] |
|
|
|
01:03:57.119 --> 01:04:03.559 |
|
I don't like updating primar models I |
|
|
|
01:04:00.119 --> 01:04:03.559 |
|
hope to not update Prim |
|
|
|
01:04:14.599 --> 01:04:19.400 |
|
models yeah thanks a lot everyone and
|
|
|
01:04:17.200 --> 01:04:19.400 |
|
yeah |
|
|
|
01:04:20.839 --> 01:04:29.400 |
|
thanks and so we will um we'll have people
|
|
|
01:04:25.640 --> 01:04:30.799 |
|
uh kind of have office hours uh every TA
|
|
|
01:04:29.400 --> 01:04:32.880 |
|
has office hours at a regular time |
|
|
|
01:04:30.799 --> 01:04:34.480 |
|
during the week uh please feel free to |
|
|
|
01:04:32.880 --> 01:04:38.400 |
|
come to their office hours or my office |
|
|
|
01:04:34.480 --> 01:04:41.960 |
|
hours um I think they are visha are they |
|
|
|
01:04:38.400 --> 01:04:43.880 |
|
posted on the site or okay yeah they |
|
|
|
01:04:41.960 --> 01:04:47.240 |
|
they either are or will be posted on the |
|
|
|
01:04:43.880 --> 01:04:49.720 |
|
site very soon um and come by to talk |
|
|
|
01:04:47.240 --> 01:04:51.480 |
|
about anything uh if there's nobody in |
|
|
|
01:04:49.720 --> 01:04:53.079 |
|
my office hours I'm happy to talk about |
|
|
|
01:04:51.480 --> 01:04:54.599 |
|
things that are unrelated but if there's |
|
|
|
01:04:53.079 --> 01:04:58.039 |
|
lots of people waiting outside or I |
|
|
|
01:04:54.599 --> 01:05:00.319 |
|
might limit it to uh like um just things |
|
|
|
01:04:58.039 --> 01:05:02.480 |
|
about the class so cool and we have |
|
|
|
01:05:00.319 --> 01:05:04.760 |
|
Piazza we'll be checking that regularly
|
|
|
01:05:02.480 --> 01:05:06.839 |
|
uh striving to get you an answer in 24 |
|
|
|
01:05:04.760 --> 01:05:12.240 |
|
hours on weekdays over weekends we might |
|
|
|
01:05:06.839 --> 01:05:16.000 |
|
not so um yeah so that's all for today |
|
|
|
01:05:12.240 --> 01:05:16.000 |
|
are there any questions |
|
|