With Python, scikit-learn, and handwritten digits. This is a notebook I made for a high school student who visited our lab. It is a simple introduction to dimensionality reduction and should not be taken as a claim that I'm an expert in unsupervised learning.
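The notebook itself isn't reproduced here, but the core step of such an introduction can be sketched as follows. This is a minimal illustration, not the notebook's actual code: it projects scikit-learn's 64-pixel digit images down to two dimensions with PCA (one of several reductions such a notebook might use).

```python
# Minimal sketch: reduce the 8x8 handwritten-digit images to 2D with PCA.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

digits = load_digits()                       # 1797 images, 8x8 = 64 pixels each
X2 = PCA(n_components=2).fit_transform(digits.data)
# X2 now holds one 2D point per digit image, ready for a scatter plot
# coloured by digits.target.
```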
Today I went to the 'Deep Learning and the Brain' workshop. This post collects some excellent points I heard there on analogies between deep artificial networks and the cortex. I have also included yesterday's talk by Yoshua Bengio.
Random samples from days three and four.
Today: sensory processing, neural dynamics, Wei Ji Ma's personal life, memory, reinforcement learning.
Today: excitement, a reception with food, announcements, two wonderful opening talks, and a poster session full of sleepy Europeans.
I looked at the teaching-award nominations for my university's teaching staff and found a Zipf-like power law (for a change).
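The kind of check behind this observation can be sketched with synthetic data standing in for the real nomination counts (the Zipf draw below is an illustrative assumption, not the actual dataset): sort counts into a rank-frequency curve and fit a line in log-log coordinates.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in for the real data: nomination counts drawn
# from a Zipf distribution.
counts = rng.zipf(a=2.0, size=500).astype(float)

# Rank-frequency curve: largest count first.
freq = np.sort(counts)[::-1]
ranks = np.arange(1, freq.size + 1)

# A power law appears as a straight line in log-log coordinates;
# the fitted slope estimates the exponent.
slope, intercept = np.polyfit(np.log(ranks), np.log(freq), 1)
```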
In the seventies, Wilson and Cowan developed a dynamical-systems approach to the large-scale behaviour of neuronal populations. Their approach ignores the behaviour of single neurons and instead works at the level of population firing rates (roughly, the number of neurons that fire per unit time) for two subpopulations: the excitatory neurons and the inhibitory neurons.
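A minimal sketch of the idea, assuming a simplified (refractoriness-free) form of the Wilson-Cowan equations with a logistic response function and illustrative parameter values, integrated with a plain Euler scheme:

```python
import numpy as np

def sigmoid(x, gain=1.2, theta=2.8):
    # Logistic response function: fraction of the population driven
    # above threshold by input x (gain and theta are illustrative).
    return 1.0 / (1.0 + np.exp(-gain * (x - theta)))

def wilson_cowan(E0=0.1, I0=0.1, T=50.0, dt=0.01,
                 w_ee=12.0, w_ei=4.0, w_ie=13.0, w_ii=11.0,
                 P=1.25, Q=0.0, tau_e=1.0, tau_i=1.0):
    """Euler integration of simplified Wilson-Cowan rate equations.

    E, I are the excitatory and inhibitory population activities;
    w_* are coupling weights, P and Q external drives.
    """
    n = int(T / dt)
    E = np.empty(n)
    I = np.empty(n)
    E[0], I[0] = E0, I0
    for t in range(1, n):
        dE = (-E[t-1] + sigmoid(w_ee * E[t-1] - w_ei * I[t-1] + P)) / tau_e
        dI = (-I[t-1] + sigmoid(w_ie * E[t-1] - w_ii * I[t-1] + Q)) / tau_i
        E[t] = E[t-1] + dt * dE
        I[t] = I[t-1] + dt * dI
    return E, I

E, I = wilson_cowan()
```

Depending on the couplings and drives, the two rates settle to a fixed point or oscillate, which is what makes the model a useful dynamical-systems playground.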
Recently, during a journal club discussion, I wondered how easily a computational neuroscientist can distinguish a real neuronal recording from a simulated dataset. So I took a real dataset, imitated it with a model, and now I'm asking you to tell me which is which.