
Commit: improve headers
FlorianDietz committed Jun 11, 2024
1 parent 4738a88 commit 9919206
Showing 1 changed file with 11 additions and 9 deletions.
README.md: 20 changes (11 additions & 9 deletions)
@@ -8,10 +8,12 @@
 - [Installation](#installation)
 - [Usage](#usage)
 - [Tutorial](#tutorial---debugging-an-example-network)
-- [The task and the architecture](#the-task-and-the-architecture)
-- [Initial exploration](#initial-exploration)
-- [Finding the bug](#finding-the-bug)
-- [Other features](#other-features)
+- [The Task and the Architecture](#the-task-and-the-architecture)
+- [Initial Exploration](#initial-exploration)
+- [Finding the Bug](#finding-the-bug)
+- [Other Features](#other-features)
 - [Customization](#customization)
+- [Known Issues](#known-issues)
+- [Future Development](#future-development)

## Overview
@@ -92,7 +94,7 @@
comgra --path "/my/path/for/storing/data/name_of_experiment_group"
Note that --path should include both the "comgra_root_path" and the "group" parameter you gave to ComgraRecorder.
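
To make the relationship explicit, here is a small illustrative snippet (not part of comgra itself) showing how the two recorder parameters combine into the path passed to the CLI; the variable names simply mirror the parameter names mentioned in the note above:

```python
import os

# Values you would pass to ComgraRecorder (parameter names taken from the note above).
comgra_root_path = "/my/path/for/storing/data"
group = "name_of_experiment_group"

# The --path argument for the comgra GUI is the root path joined with the group:
gui_path = os.path.join(comgra_root_path, group)
print(gui_path)  # /my/path/for/storing/data/name_of_experiment_group
```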


-## Tutorial - Debugging an example network
+## Tutorial - Debugging an Example Network

The file `src/scripts/run.py` trains a neural network on an example task. This network contains a subtle bug, and in this tutorial we show how you can use comgra to find it.

@@ -106,15 +108,15 @@
and you can start the GUI on the data it generates by calling
```
comgra --use-path-for-test-run
```

-### The task and the architecture
+### The Task and the Architecture

We use a synthetic task that is designed to test a neural network's ability to generalize to longer sequences, while being very simple and human-interpretable. The input is a sequence of N tuples of 5 numbers between 0.0 and 1.0. The network should treat these as 5 separate sequences. Its objective is to determine which of these 5 sequences has the largest sum.
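
To make the task concrete, the following standalone sketch (not taken from `run.py`) generates one example input and derives its target label; the sequence length `N` is chosen arbitrarily here:

```python
import random

N = 8  # arbitrary sequence length for this illustration

# One input: a sequence of N tuples of 5 numbers between 0.0 and 1.0.
sequence = [[random.random() for _ in range(5)] for _ in range(N)]

# Interpreting the tuples as 5 parallel sequences, sum each of them.
sums = [sum(step[i] for step in sequence) for i in range(5)]

# The label is the index of the sequence with the largest sum.
target = max(range(5), key=lambda i: sums[i])
print(sums, target)
```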

Our architecture is a simple recurrent neural network that is composed of three submodules. It's nothing fancy, but illustrates how comgra can be integrated into an architecture.
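
The actual submodules are defined in `src/scripts/run.py` and are not reproduced here. Purely as an illustration of the general shape, a recurrent network built from three submodules might look like the following PyTorch sketch; the module choices and sizes are invented, and the comgra recording calls are omitted because the recorder API is not shown in this excerpt:

```python
import torch
from torch import nn

class IllustrativeRecurrentNet(nn.Module):
    """Invented example: an input encoder, a recurrent core, and an output head."""

    def __init__(self, input_size=5, hidden_size=32, num_classes=5):
        super().__init__()
        self.encoder = nn.Linear(input_size, hidden_size)  # submodule 1
        self.core = nn.GRUCell(hidden_size, hidden_size)   # submodule 2
        self.head = nn.Linear(hidden_size, num_classes)    # submodule 3

    def forward(self, sequence):
        # sequence has shape (batch, N, 5); process it one time step at a time.
        batch_size, n_steps, _ = sequence.shape
        h = sequence.new_zeros(batch_size, self.core.hidden_size)
        for t in range(n_steps):
            x = torch.relu(self.encoder(sequence[:, t]))
            h = self.core(x, h)
        return self.head(h)  # one logit per candidate sequence

logits = IllustrativeRecurrentNet()(torch.rand(4, 10, 5))  # e.g. batch of 4, N = 10
```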

We run two variants of the architecture. The original variant contains a bug, which we will discover later in this section of the README. For convenience, we run both trials in one script, but in a real use case the second variant would have been implemented and run later, after finding the bug. In the GUI, you can switch between the two variants with the 'Trial' selector.

-### Initial exploration
+### Initial Exploration

As a first step, let's look at the network summary information, graphs, and the notes created by the script. To do so, select "Network" or "Notes", respectively, with the main radio button at the top left of the screen.

@@ -205,7 +207,7 @@
The best way to get a feel for the GUI is to try it yourself. There is a lot of

All of these explorative analyses are easy and fast to do, so they are often worth the time. They can save you hours or days of frustration by catching a simple mistake early. Even if you find no mistakes, they can still be helpful to get a better understanding of your network's behavior.

-### Finding the bug
+### Finding the Bug

As we noticed before, the code works, but not very well.

@@ -230,7 +232,7 @@
If you inspect this second trial, you will notice that the bug is gone and perfo

This was the sort of bug that harmed performance but still allowed the network to solve the task. Without the inspection abilities offered by comgra, it would normally be next to impossible to detect a bug like this.

-## Other features
+## Other Features

Comgra gives you a lot of different ways to look at your data. Here are some suggestions for what you can do with it:

