*UPDATED* Here is a masterpost of MOOCs (massive open online courses) that are available, archived, or starting soon. Some are short, some are very interactive, some are very in-depth. I think they will help those who like to learn with a teacher or with videos. I checked each link to make sure it works.
Spanish
Beginner
AP Spanish Language & Culture
Basic Spanish for English Speakers
Beginner’s Spanish: Food & Drink
Fastbreak Spanish
How to Self-Study Spanish
Preparing for the AP Spanish Exam
Spanish for Beginners
Intermediate
Spanish: Ciudades con Historia
Spanish: Espacios Públicos
Advanced
Corrección, Estilo y Variaciones
La Innovación Social (Check under Translation)
Leer a Macondo (Taught in Spanish)
Spanish: Con Mis Propias Manos
Spanish: Perspectivas Porteñas
Reading Spanish Literature
French
Beginner
AP French Language and Culture
Basic French Skills
Beginner’s French: Food & Drink
Diploma in French
Elementary French I
Elementary French II
Français Interactif
French in Action
French Language Studies I
French Language Studies II
French Language Studies III
French: Ouverture
French Through Stories and Conversation
Improving Your French
Mastering French Grammar and Vocab
Intermediate
French: Le Quatorze Juillet
Passe Partout
Advanced
Fantasy, de l'Angleterre Victorienne au Trône de fer
La Cité des Sciences et de l'Industrie
Les Chansons des Troubadours
Reading French Literature
Brazilian Portuguese for Beginners
Curso de Português para Estrangeiros
Italian
Beginner
Beginner’s Italian: Food & Drink
Beginner Italian I
Introduction to Italian
Oggi e Domani
Survive Italy Without Being Fluent
Intermediate
Intermediate Italian I
Advanced
Advanced Italian I
Italian Literature
Italian Novel of the Twentieth Century
L'innovazione Sociale (Check language under translation)
Reading Italian Literature
Intro to Catalan Sign Language
Latin I (Taught in Italian)
Russian
Beginner
Basics of Russian
Easy Accelerated Learning for Russian
Russian Alphabet
Russian Essentials
Russian for Beginners
Russian Level I
Russian Phonetics and Pronunciation
Reading and Writing Russian
Travel Russian
Advanced
Business Russian (must register)
Let Us Speak Russian (must register)
Reading Master and Margarita
Russian as an Instrument of Communication
Siberia: Russian for Foreigners
Read Ukrainian
Ukrainian Language for Beginners
A1-B2 Kazakh (Taught in Russian)
Chinese
Beginner
Basic Chinese
Basic Mandarin Chinese I
Basic Mandarin Chinese II
Beginner’s Chinese
Chinese for Beginners
Chinese Characters
Chinese for Travelers
Chinese is Easy
Chinese Made Easy
Easy Mandarin
First Year Chinese I
First Year Chinese II
Learn Oral Chinese
Mandarin Chinese I
Start Talking Mandarin Chinese
UT Gateway to Chinese
Intermediate
Intermediate Business Chinese
Intermediate Chinese
Intermediate Chinese Grammar
Japanese
Beginner’s Conversational Japanese
Genki
Japanese JOSHU
Learn 80 JLPT N5 Kanji I
Learn 80 JLPT N5 Kanji II
Learn 80 JLPT N5 Kanji III
Learn 80 JLPT N5 Kanji IV
Korean
Beginner
First Step Korean
How to Study Korean
Pathway to Spoken Korean
Intermediate
Intermediate Korean
Introduction to Dutch
German
Beginner
Basic German
Basic Language Skills
Beginner’s German: Food & Drink
Conversational German I
Conversational German II
Conversational German III
Conversational German IV
Deutsch im Blick
Diploma in German
German A1 Grammar
German Alphabet
German Modal Verbs
Present Tense German
Rundblick-Beginner’s German
Study German Language from Native Speakers
Advanced
German: Regionen, Traditionen und Geschichte
Landschaftliche Vielfalt
Reading German Literature
Learn The Norwegian Language
Norwegian on the Web
Intro to Swedish
A Taste of Finnish
Basic Finnish
Finnish for Immigrants
Finnish for Medical Professionals
Introduction to Frisian (Taught in Dutch)
Icelandic 1-5
Arabic for Global Exchange (in the drop down menu)
Arabic Language for Beginners
Arabic Without Walls
Conversational Arabic Made Easy
Intro to Arabic
Lebanese Arabic
Madinah Arabic
Moroccan Arabic
Read Arabic
Hebrew Alphabet Crashcourse
Know the Hebrew Alphabet
A Door into Hindi
Business Hindi
Virtual Hindi
Learn Indonesian
Beginner’s Conversation and Grammar
Beginner’s Welsh
Discovering Wales
Introduction to Irish
http://ocw.mit.edu/courses/global-studies-and-languages/ : MIT’s OpenCourseWare site has assignments and course materials available.
I’ll keep an eye out for new courses and if you know of any, let me know so I can update this list.
(Fig.1. Neuron connections in biological neural networks. Source: MIPT press office)
Physicists build “electronic synapses” for neural networks
A team of scientists from the Moscow Institute of Physics and Technology (MIPT) has created prototypes of “electronic synapses” based on ultra-thin films of hafnium oxide (HfO2). These prototypes could potentially be used in fundamentally new computing systems. The paper has been published in the journal Nanoscale Research Letters.
The group of researchers from MIPT has made HfO2-based memristors measuring just 40×40 nm². The nanostructures they built exhibit properties similar to those of biological synapses. Using newly developed technology, the memristors were integrated into matrices; in the future, this technology may be used to design computers that function similarly to biological neural networks.
Memristors (resistors with memory) are devices that are able to change their state (conductivity) depending on the charge passing through them, and they therefore have a memory of their “history”. In this study, the scientists used devices based on thin-film hafnium oxide, a material that is already used in the production of modern processors. This means that this new lab technology could, if required, easily be used in industrial processes.
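For readers who want to play with the idea, the sketch below is a toy Python simulation of a memristor using the classic linear-drift model, not the HfO2 device physics from the paper; all the constants are illustrative assumptions. It shows the defining property: the resistance at any moment depends on the charge that has already flowed through the device.

```python
# Toy memristor simulation (linear-drift model; illustrative constants only,
# not the HfO2 devices described in the article).
import numpy as np

R_ON, R_OFF = 100.0, 16e3    # fully-ON / fully-OFF resistance in ohms (assumed)
MU, D = 1e-14, 1e-8          # dopant mobility (m^2 V^-1 s^-1) and film thickness (m), assumed

def simulate(voltage, dt=1e-6, w0=0.1):
    """Return the current trace produced by a driving voltage waveform."""
    w = w0 * D                                      # internal state: boundary position in the film
    currents = []
    for v in voltage:
        m = R_ON * (w / D) + R_OFF * (1 - w / D)    # instantaneous resistance ("memristance")
        i = v / m
        w += MU * (R_ON / D) * i * dt               # the state drifts with the charge that passes
        w = min(max(w, 0.0), D)                     # keep the state inside the film
        currents.append(i)
    return np.array(currents)

t = np.linspace(0, 2e-3, 2000)
v = np.sin(2 * np.pi * 1e3 * t)                     # 1 kHz sine drive
i = simulate(v)
# Plotting i against v gives the pinched hysteresis loop that is the
# fingerprint of a memristor: its conductance "remembers" past charge.
```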
“In a simpler version, memristors are promising binary non-volatile memory cells, in which information is written by switching the electric resistance – from high to low and back again. What we are trying to demonstrate are much more complex functions of memristors – that they behave similarly to biological synapses,” said Yury Matveyev, corresponding author of the paper and a senior researcher at MIPT’s Laboratory of Functional Materials and Devices for Nanoelectronics, commenting on the study.
Synapses – the key to learning and memory
A synapse is a point of connection between neurons, the main function of which is to transmit a signal (a spike – a particular type of signal, see fig. 2) from one neuron to another. Each neuron may have thousands of synapses, i.e. connect with a large number of other neurons. This means that information can be processed in parallel, rather than sequentially (as in modern computers). This is the reason why “living” neural networks are so immensely effective, both in terms of speed and energy consumption, in solving a large range of tasks such as image and voice recognition.
(Fig.2 The type of electrical signal transmitted by neurons (a “spike”). The red lines are various other biological signals, the black line is the averaged signal. Source: MIPT press office)
Over time, synapses may change their “weight”, i.e. their ability to transmit a signal. This property is believed to be the key to understanding the learning and memory functions of the brain.
From the physical point of view, synaptic “memory” and “learning” in the brain can be interpreted as follows: the neural connection possesses a certain “conductivity”, which is determined by the previous “history” of signals that have passed through the connection. If a synapse transmits a signal from one neuron to another, we can say that it has high “conductivity”, and if it does not, we say it has low “conductivity”. However, synapses do not simply function in on/off mode; they can have any intermediate “weight” (intermediate conductivity value). Accordingly, if we want to simulate them using certain devices, these devices will also have to have analogous characteristics.
The memristor as an analogue of the synapse
As in a biological synapse, the value of the electrical conductivity of a memristor is the result of its previous “life” – from the moment it was made.
There are a number of physical effects that can be exploited to design memristors. In this study, the authors used devices based on an ultrathin film of hafnium oxide, which exhibits the effect of soft (reversible) electrical breakdown under an applied external electric field. Most often, these devices use only two different states encoding logic zero and one. However, in order to simulate biological synapses, a continuous spectrum of conductivities had to be used in the devices.
“The detailed physical mechanism behind the function of the memristors in question is still debated. However, the qualitative model is as follows: in the metal–ultrathin oxide–metal structure, charged point defects, such as vacancies of oxygen atoms, are formed and move around in the oxide layer when exposed to an electric field. It is these defects that are responsible for the reversible change in the conductivity of the oxide layer,” says Sergey Zakharchenko, co-author of the paper and a researcher at MIPT’s Laboratory of Functional Materials and Devices for Nanoelectronics.
The authors used the newly developed “analogue” memristors to model various learning mechanisms (“plasticity”) of biological synapses. In particular, this involved functions such as long-term potentiation (LTP) or long-term depression (LTD) of a connection between two neurons. It is generally accepted that these functions are the underlying mechanisms of memory in the brain.
The authors also succeeded in demonstrating a more complex mechanism – spike-timing-dependent plasticity, i.e. the dependence of the value of the connection between neurons on the relative time taken for them to be “triggered”. It had previously been shown that this mechanism is responsible for associative learning – the ability of the brain to find connections between different events.
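For concreteness, here is a minimal Python sketch of the standard pairwise spike-timing-dependent plasticity rule referred to above; the amplitudes and time constants are illustrative assumptions, not the measured memristor curves. If the presynaptic spike arrives shortly before the postsynaptic one, the weight grows (LTP); if it arrives after, the weight shrinks (LTD).

```python
# Pairwise STDP rule (constants are assumed for illustration).
import math

A_PLUS, A_MINUS = 0.05, 0.055       # potentiation / depression amplitudes (assumed)
TAU_PLUS, TAU_MINUS = 20.0, 20.0    # decay time constants in ms (assumed)

def stdp_delta_w(t_pre, t_post):
    """Weight change as a function of the spike-timing difference (in ms)."""
    dt = t_post - t_pre
    if dt > 0:     # pre fires before post -> long-term potentiation (LTP)
        return A_PLUS * math.exp(-dt / TAU_PLUS)
    if dt < 0:     # post fires before pre -> long-term depression (LTD)
        return -A_MINUS * math.exp(dt / TAU_MINUS)
    return 0.0

# A presynaptic spike 5 ms before the postsynaptic one strengthens the
# connection; 5 ms after, it weakens it.
print(stdp_delta_w(0.0, 5.0), stdp_delta_w(5.0, 0.0))
```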
To demonstrate this function in their memristor devices, the authors purposefully used an electric signal which reproduced, as far as possible, the signals in living neurons, and they obtained a dependency very similar to those observed in living synapses (see fig. 3).
(Fig.3. The change in conductivity of memristors depending on the temporal separation between “spikes” (right), and the change in potential of the neuron connections in biological neural networks. Source: MIPT press office)
These results allowed the authors to confirm that the elements that they had developed could be considered a prototype of the “electronic synapse”, which could be used as a basis for the hardware implementation of artificial neural networks.
“We have created a baseline matrix of nanoscale memristors demonstrating the properties of biological synapses. Thanks to this research, we are now one step closer to building an artificial neural network. It may only be the very simplest of networks, but it is nevertheless a hardware prototype,” said the head of MIPT’s Laboratory of Functional Materials and Devices for Nanoelectronics, Andrey Zenkevich.
Fairyland, or, Through the Enchanted Forest.
[Gloucester, England], [Roberts Brothers], [ca. 1890-1915].
Bryn Mawr College Special Collections GV1469.F221 T4 1890z
This whimsical board game, with its blinding powder, cabbage, meat, and cake tokens, has it all! Wild two-headed animals ravage the forest and hapless children must make it through armed only with odd grocery items! The game is a new addition to the Bryn Mawr College Special Collections as part of the Ellery Yale Wood Collection of Children’s Books.
If this Tibetan padlock looks massive that is because it is! And it has not one but THREE equally ginormous keys to open it! It is definitely a very intriguing padlock and indeed something of a puzzle as all three keys must be fitted simultaneously for it to open. There is one lock at the top underneath a panel, another on the side, and a third under a hinged panel on the back. Quite impressive!
We think it dates from the late nineteenth or early twentieth century and was purchased for the museum by Emslie Horniman, who was the son of our founder Frederick Horniman.
Object no. 13.223
Paralyzed ALS patient operates speech computer with her mind
At UMC Utrecht, a brain implant has been placed in a patient, enabling her to operate a speech computer with her mind. The researchers and the patient worked intensively to get the settings right. She can now communicate at home with her family and caregivers via the implant. That a patient can use this technique at home is unique in the world. This research was published in the New England Journal of Medicine.
Because she suffers from ALS, the patient is no longer able to move or speak. Doctors placed electrodes in her brain to pick up brain activity, enabling her to wirelessly control a speech computer that she now uses at home.
Mouse click
The patient operates the speech computer by moving her fingers in her mind. This changes the brain signal under the electrodes. That change is converted into a mouse click. On a screen in front of her she can see the alphabet, plus some additional functions such as deleting a letter or word and selecting words based on the letters she has already spelled. The letters on the screen light up one by one. She selects a letter by influencing the mouse click at the right moment with her brain. That way she can compose words, letter by letter, which are then spoken by the speech computer. This technique is comparable to actuating a speech computer via a push-button (with a muscle that can still function, for example, in the neck or hand). So now, if a patient lacks muscle activity, a brain signal can be used instead.
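The selection logic itself is simple enough to sketch. The Python below is a toy illustration of such a scanning speller, not the UMC Utrecht software: letters are highlighted one after another, and whichever letter is highlighted when a “click” arrives is selected. The click detector is a stand-in here; in the real system it is the decoded brain signal.

```python
# Toy scanning-speller loop (illustrative only; not the UMC Utrecht software).
import itertools

ALPHABET = list("ABCDEFGHIJKLMNOPQRSTUVWXYZ ")   # " " stands for the space key

def spell_one_letter(click_detected, scan_order=ALPHABET, max_cycles=3):
    """Highlight letters one by one; return the letter highlighted when a click arrives."""
    for letter in itertools.islice(itertools.cycle(scan_order),
                                   max_cycles * len(scan_order)):
        # In the real interface the letter lights up on screen for a fixed dwell
        # time while the system watches for the brain-generated click.
        if click_detected(letter):
            return letter
    return None     # no click detected after a few full passes

# Toy usage: a "user" who clicks whenever the letter they want is highlighted.
target = "H"
print(spell_one_letter(lambda highlighted: highlighted == target))   # -> H
```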
Wireless
The patient underwent surgery during which electrodes were placed on her brain through tiny holes in her skull. A small transmitter was then placed in her body below her collarbone. This transmitter receives the signals from the electrodes via subcutaneous wires, amplifies them and transmits them wirelessly. The mouse click is calculated from these signals, actuating the speech computer. The patient is closely supervised. Shortly after the operation, she started on a journey of discovery together with the researchers to find the right settings for the device and the perfect way to get her brain activity under control. It started with a “simple” game to practice the art of clicking. Once she mastered clicking, she focused on the speech computer. She can now use the speech computer without the help of the research team.
The UMC Utrecht Brain Center has spent many years researching the possibility of controlling a computer by means of electrodes that capture brain activity. Speech computers driven by brain signals measured with an electrode cap have long been tested in various research laboratories. That a patient can use the technique at home, through invisible, implanted electrodes, is unique in the world.
If the implant proves to work well in three people, the researchers hope to launch a larger, international trial. Lead researcher Nick Ramsey: “We hope that these results will stimulate research into more advanced implants, so that some day not only people with communication problems, but also people with paraplegia, for example, can be helped.”
In general I am a casual observer and usually do not make comments, especially since I am here to learn and have no background in linguistics. But in this case I feel strongly compelled to put my 2 cents' worth of thoughts in.
Although I cannot say that I am anything like fluent, I do have a reasonable amount of Mandarin Chinese and Japanese, and I have to say the first thing I thought when I saw this article was "ah". Although I can see how katakana is derived from Chinese, using the rather restricted stroke combinations that are the basis of all Chinese characters, the same cannot be said for hiragana, because, at the very least, squiggles do not exist in Chinese, at least not by the time it was exported to Japan. What might look like squiggles in Chinese are in fact just a possibly lazy, or perhaps more elegant, way of writing, the way cursive looks compared to printed letters. Hiragana bears only a superficial resemblance to Chinese and always feels like it must have had another source of inspiration.
Also keep in mind that Chinese was essentially an imported language in Japan, and an attempt to shoehorn Japanese sounds into Chinese characters (which, I think I can safely say, did not sound the same) must have been unwieldy at best. In fact, today, Japanese pronunciations of kanji differ so much from the Chinese, and often their usage does too, that I would use my knowledge of the characters only as a rough starting point as to what they might mean in Japanese.
Also, I looked up Kūkai, and, to cut a long story short, he was a Japanese Buddhist monk who went to China to study the sutras, and, to quote from the Wikipedia page directly:
Kūkai arrived back in Japan in 806 as the eighth Patriarch of Esoteric Buddhism, having learnt Sanskrit and its Siddhaṃ script, studied Indian Buddhism, as well as having studied the arts of Chinese calligraphy and poetry, all with recognized masters. He also arrived with a large number of texts, many of which were new to Japan and were esoteric in character, as well as several texts on the Sanskrit language and the Siddhaṃ script.
And a quick look at the Siddham script shows that it has its roots in the Aramaic alphabet.
This is the man to whom the invention of the kana system is attributed, and if that is the case, I see a possible connection that is not as far-fetched as it seems.
In the Japanese language, we have three types of letters: Kanji, Hiragana, and Katakana.
Hiragana’s roots are in old Ivrit (Hebrew) and Palmyrene letters.
The first column: Phoenician alphabet; the second column: Ostracon; the third column: Old Aramaic; the fourth column: Imperial Aramaic; the fifth column: Dead Sea Scrolls; the sixth column: Palmyrene script; the seventh column: Palmyra.
The first column: Hiragana; the second column: consonants; the third column: vowels; the fourth column: the consonant and vowel combined; the fifth column: Sousho-tai (a handwriting style); the sixth column: Kanji.
Awesome things you can do (or learn) through TensorFlow. From the site:
Um, What Is a Neural Network?
It’s a technique for building a computer program that learns from data. It is based very loosely on how we think the human brain works. First, a collection of software “neurons” are created and connected together, allowing them to send messages to each other. Next, the network is asked to solve a problem, which it attempts to do over and over, each time strengthening the connections that lead to success and diminishing those that lead to failure. For a more detailed introduction to neural networks, Michael Nielsen’s Neural Networks and Deep Learning is a good place to start. For a more technical overview, try Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville.
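As a concrete (if tiny) illustration of that description, here is a minimal TensorFlow/Keras script, not taken from the linked site, in which a small network of “neurons” adjusts its connection weights until it solves a toy problem: learning the XOR function.

```python
# Minimal neural-network example: learning XOR (illustrative sketch).
import numpy as np
import tensorflow as tf

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=np.float32)
y = np.array([[0], [1], [1], [0]], dtype=np.float32)   # XOR labels

model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(8, activation="tanh"),        # hidden "neurons"
    tf.keras.layers.Dense(1, activation="sigmoid"),      # output neuron
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.01),
              loss="binary_crossentropy", metrics=["accuracy"])

# Repeated attempts strengthen the connections that reduce the error.
model.fit(X, y, epochs=2000, verbose=0)
print(model.predict(X).round().ravel())   # typically converges to [0. 1. 1. 0.]
```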
GitHub
h-t FlowingData
A reblog of nerdy and quirky stuff that piques my interest.