
DeepLearning.scala, a toolkit for creating complex neural networks


杨博 (Yang Bo)

Happy New Year, everyone,

We, the Big Data team at ThoughtWorks China, have open-sourced DeepLearning.scala, a toolkit for creating complex neural networks.

With DeepLearning.scala, an ordinary programmer can build complex neural networks from simple code. You write code as usual; the only difference is that code written with DeepLearning.scala is differentiable, which lets it evolve and continuously adjust its internal parameters.

Features

Differentiable basic types

Like Theano and other deep learning toolkits, DeepLearning.scala allows you to build neural networks from mathematical formulas. These formulas handle floats, doubles, and GPU-accelerated N-dimensional arrays, and DeepLearning.scala calculates the derivatives of the weights in them.
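To illustrate the underlying idea, here is a minimal plain-Scala sketch of forward-mode automatic differentiation with dual numbers (this is a conceptual illustration only, not the DeepLearning.scala API):

```scala
// A dual number carries a value together with its derivative
// with respect to one chosen input variable.
case class Dual(value: Double, derivative: Double) {
  def +(that: Dual): Dual =
    Dual(value + that.value, derivative + that.derivative)
  def *(that: Dual): Dual =
    // product rule: (uv)' = u'v + uv'
    Dual(value * that.value, derivative * that.value + value * that.derivative)
}

def variable(x: Double): Dual = Dual(x, 1.0) // d x / d x = 1
def constant(c: Double): Dual = Dual(c, 0.0) // constants have zero derivative

// f(x) = x * x + 3 * x, so f'(x) = 2 * x + 3
def f(x: Dual): Dual = x * x + constant(3.0) * x

val y = f(variable(2.0)) // y.value = 10.0, y.derivative = 7.0
```

Evaluating `f` once produces both the result and the derivative, which is what makes a formula "differentiable" in this sense.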

Differentiable ADT

Neural networks created by DeepLearning.scala can handle ADT data structures (e.g. HList and Coproduct) and calculate derivatives through these data structures.
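The following hedged sketch shows the general idea in plain Scala, using a hand-rolled cons-list ADT in place of shapeless's HList (the `Dual`, `DualList`, and `sumAll` names are illustrative, not from the library):

```scala
// A dual number: a value plus its derivative with respect to one input.
case class Dual(value: Double, derivative: Double)

// A simple algebraic data type: a cons list of dual numbers.
sealed trait DualList
case object DNil extends DualList
final case class DCons(head: Dual, tail: DualList) extends DualList

// Fold over the ADT; derivatives accumulate alongside values,
// so the derivative flows through the data structure.
def sumAll(xs: DualList): Dual = xs match {
  case DNil => Dual(0.0, 0.0)
  case DCons(h, t) =>
    val rest = sumAll(t)
    Dual(h.value + rest.value, h.derivative + rest.derivative)
}

// Differentiate with respect to the first element only (its derivative is 1).
val xs = DCons(Dual(1.0, 1.0), DCons(Dual(2.0, 0.0), DNil))
val total = sumAll(xs) // total.value = 3.0, total.derivative = 1.0
```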

Differentiable control flow

Neural networks created by DeepLearning.scala may contain control flow such as if/else/switch/case. Combined with ADT data structures, you can implement arbitrary algorithms inside neural networks and train the variables used in those algorithms.
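As a conceptual sketch of differentiable control flow (again in plain Scala with dual numbers, not the library's API), the derivative simply follows whichever branch is actually taken:

```scala
// A dual number: a value plus its derivative with respect to one input.
case class Dual(value: Double, derivative: Double)

// ReLU implemented with an ordinary if/else.
def relu(x: Dual): Dual =
  if (x.value > 0) x   // positive branch: gradient passes through unchanged
  else Dual(0.0, 0.0)  // non-positive branch: gradient is zero

val pos = relu(Dual(2.0, 1.0))  // Dual(2.0, 1.0)
val neg = relu(Dual(-1.0, 1.0)) // Dual(0.0, 0.0)
```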

Composability

Neural networks created by DeepLearning.scala are composable. You can create large networks by combining smaller networks. If two larger networks share a sub-network, training the weights of that shared sub-network through one network will affect the other network.
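A hedged plain-Scala sketch of this sharing behavior (the `SharedLayer`, `networkA`, and `networkB` names are hypothetical, not the library's API): because the weight lives in the shared sub-network, a training step driven by one network changes the other network's output.

```scala
// A shared sub-network holding a single trainable weight.
final class SharedLayer(var weight: Double) {
  def apply(x: Double): Double = weight * x
  // A gradient-descent step; the gradient here is supplied by the caller.
  def step(gradient: Double, learningRate: Double): Unit =
    weight -= learningRate * gradient
}

val shared = new SharedLayer(1.0)
def networkA(x: Double): Double = shared(x) + 1.0 // uses the shared layer
def networkB(x: Double): Double = shared(x) * 2.0 // uses the same layer

val before = networkB(3.0) // 6.0, since the shared weight is 1.0
// A hypothetical training step computed from networkA's loss:
shared.step(gradient = 0.5, learningRate = 1.0)
val after = networkB(3.0)  // 3.0, because the shared weight dropped to 0.5
```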

Static type system

All the above features are statically type checked.

Roadmap

v1.0

Version 1.0 is the current version and includes all of the above features. We have published 1.0.0-RC4 on the Maven Central repository. If no critical issues are found, the final version 1.0.0 will be released in a few days.

v2.0

  • Support for/while and other higher-order functions on differentiable Seqs.
  • Support for/while and other higher-order functions on GPU-accelerated differentiable N-dimensional arrays.

Version 2.0 will be released in March 2017.

v3.0

  • Support using custom case classes inside neural networks.
  • Support distributed models and distributed training on Spark.

Version 3.0 will be released in late 2017.

Acknowledgements

DeepLearning.scala is heavily inspired by my colleague @MarisaKirisame.

@milessabin's shapeless provides a solid foundation for type-level programming used in DeepLearning.scala.

--
You received this message because you are subscribed to the Google Groups "scala-user" group.