# Monorepo of Deeplearning4j
Welcome to the new monorepo of Deeplearning4j. It contains the source code for all of the following projects, in addition to the original Deeplearning4j repository, which has moved into the deeplearning4j directory:
- https://github.com/eclipse/deeplearning4j/tree/master/libnd4j
- https://github.com/eclipse/deeplearning4j/tree/master/nd4j
- https://github.com/eclipse/deeplearning4j/tree/master/datavec
- https://github.com/eclipse/deeplearning4j/tree/master/arbiter
- https://github.com/eclipse/deeplearning4j/tree/master/nd4s
- https://github.com/eclipse/deeplearning4j/tree/master/rl4j
- https://github.com/eclipse/deeplearning4j/tree/master/scalnet
- https://github.com/eclipse/deeplearning4j/tree/master/pydl4j
- https://github.com/eclipse/deeplearning4j/tree/master/jumpy
- https://github.com/eclipse/deeplearning4j/tree/master/pydatavec
To build everything, we can use commands like:
./change-cuda-versions.sh x.x
./change-scala-versions.sh 2.xx
./change-spark-versions.sh x
mvn clean install -Dmaven.test.skip -Dlibnd4j.cuda=x.x -Dlibnd4j.compute=xx
or
mvn -B -V -U clean install -pl '!jumpy,!pydatavec,!pydl4j' -Dlibnd4j.platform=linux-x86_64 -Dlibnd4j.chip=cuda -Dlibnd4j.cuda=9.2 -Dlibnd4j.compute=<your GPU CC> -Djavacpp.platform=linux-x86_64 -Dmaven.test.skip=true
An example of a GPU "CC", or compute capability, is 61 for the Titan X Pascal.
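As a concrete illustration of the steps above, here is a sketch of an end-to-end GPU build using the CUDA 9.2 / compute capability 61 values mentioned in this README. The Scala version and the skipped Python modules are illustrative assumptions, not requirements; the compute capability of your own card can be looked up, for example, with the `deviceQuery` sample that ships with the CUDA toolkit samples.

```shell
# Illustrative build sketch; the Scala version below is an assumption, not a requirement.
# Switch the build to CUDA 9.2 and Scala 2.11 using the helper scripts in the repo root.
./change-cuda-versions.sh 9.2
./change-scala-versions.sh 2.11

# Build the Java/C++ modules, skipping tests and the Python wrappers,
# targeting a Titan X Pascal (compute capability 61).
mvn -B -V -U clean install -pl '!jumpy,!pydatavec,!pydl4j' \
    -Dlibnd4j.platform=linux-x86_64 -Dlibnd4j.chip=cuda \
    -Dlibnd4j.cuda=9.2 -Dlibnd4j.compute=61 \
    -Djavacpp.platform=linux-x86_64 -Dmaven.test.skip=true
```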
Want some examples?
We have a separate repository with various examples available: https://github.com/eclipse/deeplearning4j-examples
In the examples repo, you'll also find a tutorial series in Zeppelin: https://github.com/eclipse/deeplearning4j-examples/tree/master/tutorials