init commit
jph00 committed Feb 28, 2020
1 parent 366caca commit 0a3a554
Showing 204 changed files with 49,017 additions and 0 deletions.
2,779 changes: 2,779 additions & 0 deletions 01_intro.ipynb

1,792 changes: 1,792 additions & 0 deletions 02_production.ipynb

988 changes: 988 additions & 0 deletions 03_ethics.ipynb

4,881 changes: 4,881 additions & 0 deletions 04_mnist_basics.ipynb

2,473 changes: 2,473 additions & 0 deletions 05_pet_breeds.ipynb

1,905 changes: 1,905 additions & 0 deletions 06_multicat.ipynb

1,018 changes: 1,018 additions & 0 deletions 07_sizing_and_tta.ipynb

2,259 changes: 2,259 additions & 0 deletions 08_collab.ipynb

9,638 changes: 9,638 additions & 0 deletions 09_tabular.ipynb

2,360 changes: 2,360 additions & 0 deletions 10_nlp.ipynb

1,313 changes: 1,313 additions & 0 deletions 11_nlp_dive.ipynb

1,157 changes: 1,157 additions & 0 deletions 12_better_rnn.ipynb

2,726 changes: 2,726 additions & 0 deletions 13_convolutions.ipynb

1,069 changes: 1,069 additions & 0 deletions 14_deep_conv.ipynb

1,275 changes: 1,275 additions & 0 deletions 15_resnet.ipynb

476 changes: 476 additions & 0 deletions 16_arch_details.ipynb

985 changes: 985 additions & 0 deletions 17_accel_sgd.ipynb

424 changes: 424 additions & 0 deletions 18_callbacks.ipynb

2,415 changes: 2,415 additions & 0 deletions 19_foundations.ipynb

669 changes: 669 additions & 0 deletions 20_CAM.ipynb

1,766 changes: 1,766 additions & 0 deletions 21_learner.ipynb

76 changes: 76 additions & 0 deletions 22_conclusion.ipynb
@@ -0,0 +1,76 @@
{
"cells": [
{
"cell_type": "raw",
"metadata": {},
"source": [
"[[chapter_conclusion]]"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Concluding thoughts"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Congratulations! You've made it! If you have worked through all of the notebooks to this point, then you have joined a small but growing group of people who are able to harness the power of deep learning to solve real problems. You may not feel that way; in fact, you probably do not feel that way. We have seen again and again that students who complete the fast.ai courses dramatically underestimate how effective they are as deep learning practitioners. We've also seen that these people are often underestimated by those who have come out of a classic academic background. So, for you to rise above your own expectations and the expectations of others, what you do next, after closing this book, is even more important than what you've done to get to this point.\n",
"\n",
"The most important thing is to keep the momentum going. In fact, as you know from your study of optimisers, momentum is something which can build upon itself! So think about what you can do now to maintain and accelerate your deep learning journey. Here are a few ideas:"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<img alt=\"What to do next\" width=\"550\" caption=\"What to do next\" id=\"do_next\" src=\"images/att_00053.png\">"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We've talked a lot in this book about the value of writing, whether it be code or prose. But perhaps you haven't written quite as much as you had hoped so far. That's okay! Now is a great chance to turn that around. You have a lot to say at this point. Perhaps you have tried some experiments on a dataset which other people don't seem to have looked at in quite the same way, so tell the world about it! Or perhaps you are just curious to try out some ideas that you had been thinking about while you were reading; now is a great chance to turn those ideas into code.\n",
"\n",
"One fairly low-key place for your writing is the fast.ai forums at forums.fast.ai. You will find that the community there is very supportive and helpful, so please do drop by and let us know what you've been up to. Or see if you can answer a few questions for those folks who are earlier in their journey than you.\n",
"\n",
"And if you do have some success, big or small, in your deep learning journey, be sure to let us know! It's especially helpful if you post about it on the forums, because hearing about the successes of other students can be extremely motivating for those who come after you.\n",
"\n",
"For many people, perhaps the most important way to stay connected with their learning journey is to build a community around it. For instance, you could try to set up a small deep learning meetup in your local neighbourhood, or a study group, or even offer to give a talk at a local meetup about what you've learned so far, or some particular aspect that interested you. It is okay that you are not the world's leading expert just yet; the important thing to remember is that you now know about plenty of stuff that other people don't, so they are very likely to appreciate your perspective.\n",
"\n",
"Another community event which many people find useful is a regular book club or paper reading club. You might find that there are some in your neighbourhood already; otherwise, you could try to get one started yourself. Even if there is just one other person doing it with you, it will help give you the support and encouragement to get going.\n",
"\n",
"If you are not somewhere where it's easy to get together with like-minded folks in person, drop by the forums, because there are always people starting up virtual study groups. These generally involve a bunch of people getting together over video chat once every week or so to discuss some deep learning topic.\n",
"\n",
"Hopefully, by this point, you have a few little projects that you've put together, and experiments that you've run. Our recommendation is generally to pick one of these and make it as good as you can. Really polish it up into the best piece of work that you can, something you're really proud of. This will force you to go much deeper into a topic, which will really test your understanding and give you the opportunity to see what you can do when you really put your mind to it.\n",
"\n",
"Also, you may want to take a look at the fast.ai free online course, which covers the same material as this book. Sometimes, seeing the same material in two different ways can really help to crystallise the ideas. In fact, researchers who study human learning have found that this is one of the best ways to learn material: to see the same thing from different angles, described in different ways.\n",
"\n",
"Your final mission, should you choose to accept it, is to take this book and give it to somebody you know, and help somebody else start their own deep learning journey!"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"jupytext": {
"split_at_heading": true
},
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
}
},
"nbformat": 4,
"nbformat_minor": 2
}