Commit Graph

50 Commits (b686368b82027a756572799182b14c97f8892b44)

Author SHA1 Message Date
Andrii T c34790f932
Copied and pasted RegressionTest100b4.java to RegressionTest100b6.jav… (#215)
* Copied and pasted RegressionTest100b4.java to RegressionTest100b6.java with renamed b4->b6

* assertEquals > assertTrue for half dtype

Signed-off-by: atuzhykov <andrewtuzhukov@gmail.com>
2020-02-14 11:53:35 +11:00
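
The half-dtype bullet in the commit above swaps between exact and tolerance-based assertions. A minimal JUnit sketch of a tolerance-based FP16 comparison (values and class name are illustrative, not taken from the actual RegressionTest100b6 code):

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    public class HalfPrecisionAssertSketch {

        @Test
        public void compareAfterHalfRoundTrip() {
            // FP16 carries only ~3 decimal digits of precision, so exact comparisons of values
            // that went through a half-precision cast are brittle; compare with a delta instead.
            float expected = 0.123456f;
            float actual   = 0.1235f;   // illustrative value after a round-trip through half precision

            assertEquals(expected, actual, 1e-3);   // tolerance-based check rather than exact equality
        }
    }
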
Alexander Stoyakin 8c0e378ec3
Improving SameDiff tests coverage (#227)
* Gradients tests added

* Fix for Standard deviation serialization + test

Signed-off-by: Alex Black <blacka101@gmail.com>

* More fixes

Signed-off-by: Alex Black <blacka101@gmail.com>

* Test fixed

* Spark driver host config for CI

Signed-off-by: Alex Black <blacka101@gmail.com>

* Op validation timeout increase

Signed-off-by: Alex Black <blacka101@gmail.com>

* Gradient check - fix for low probability test failure due to randomly all 0s mask

Signed-off-by: AlexDBlack <blacka101@gmail.com>

Co-authored-by: Alex Black <blacka101@gmail.com>
2020-02-13 10:29:08 +11:00
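
One of the bullets above (the Spark driver host config for CI) pins the driver host so local-master Spark tests start reliably on CI machines. A minimal sketch of that kind of configuration (app name and surrounding structure are assumptions, not taken from the PR):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class SparkCiConfigSketch {
        public static void main(String[] args) {
            // On CI machines without resolvable hostnames, pinning spark.driver.host avoids
            // bind/lookup failures when the local master starts up.
            SparkConf conf = new SparkConf()
                    .setMaster("local[*]")
                    .setAppName("dl4j-spark-tests")          // assumed app name
                    .set("spark.driver.host", "localhost");  // the CI-related setting

            JavaSparkContext sc = new JavaSparkContext(conf);
            try {
                // ... run Spark-based tests here ...
            } finally {
                sc.stop();
            }
        }
    }
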
Alex Black 569a46f87d
Fixes (#213)
* Increase timeouts for 2 tests occasionally failing on CI

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Explicitly set character encoding via argline for maven surefire tests

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* CUDA gradient check timeout fix + simple rnn masking fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2020-02-05 17:07:36 +11:00
Alex Black 57d5eb473b
Fixes for global pooling + masking with different mask datatypes (#212)
* Fixes for global pooling + masking with different mask datatypes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Global pooling backprop dtype fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2020-02-04 15:38:06 +11:00
Alex Black 0756e3fe70
Small fixes. (#206)
* Logging format tweaks for file logging

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Min abs error tweak for Util layer gradient checks

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8648 Fix SameDiff NPE instead of error for missing placeholders

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Test runtime reduction

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2020-02-01 18:19:36 +11:00
raver119 7ef0ef907e
Packages fix (#193)
* packages fix

Signed-off-by: raver119 <raver119@gmail.com>

* few imports fixed

Signed-off-by: raver119 <raver119@gmail.com>

* few imports fixed

Signed-off-by: raver119 <raver119@gmail.com>
2020-01-27 23:04:21 +03:00
Alexander Stoyakin 4db28a9300 Cleanup of multiple projects (#175)
* Cleanup modules

* Moving subprojects to nd4j-api

* Project cleanup

* Dropped AWS sub-project

* dl4j-util moved to core

* dl4j-perf moved to core

* Tests coverage

* Revert "Moving subprojects to nd4j-api"

This reverts commit bc6eb573c6b60c407ade47172c5d204725077e6b.

* Moved nd4j-buffer and nd4j-context to nd4j-api

* Rolled back change

* Revert "Project cleanup"

This reverts commit 64ac7f369b2d968f7be437718034f093fc886ffc.

* Datavec cleaned up

* Revert "Moved nd4j-buffer and nd4j-context to nd4j-api"

This reverts commit 75f4e8da80d2551e44e1251dd6c5923289fff8e1.

# Conflicts:
#	nd4j/nd4j-backends/nd4j-tests/src/test/java/org/nd4j/autodiff/opvalidation/ReductionBpOpValidation.java

* Resolve conflict

* Compilation fixed.

* nd4j-context and nd4j-buffer moved to nd4j-api

* Fixed TF mapping for mmul

* Fix for dl4j-cuda tests

Signed-off-by: Alex Black <blacka101@gmail.com>

* Move last few tests from deeplearning4j-nn to -core

Signed-off-by: Alex Black <blacka101@gmail.com>

* Remove incorrect TF import mapping for TensorMmul op

Signed-off-by: Alex Black <blacka101@gmail.com>

* Cleaned TF mapping

* Fix path for test results on windows

* Remove old dependency

Signed-off-by: Alex Black <blacka101@gmail.com>

* One more attempt to fix path for test results on windows

* fixup! One more attempt to fix path for test results on windows

* fixup! One more attempt to fix path for test results on windows

Co-authored-by: Alex Black <blacka101@gmail.com>
Co-authored-by: Serhii Shepel <9946053+sshepel@users.noreply.github.com>
Co-authored-by: raver119 <raver119@gmail.com>
2020-01-24 22:35:00 +03:00
Alex Black a25bb6a11c
Unit/integration test split + test speedup (#166)
* Add maven profile + base tests methods for integration tests

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Switch from system property to environment variable; seems more reliable in intellij

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Add nd4j-common-tests module, and common base test; cleanup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Ensure all ND4J tests extend BaseND4JTest

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Test spam reduction, import fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Add test logging to nd4j-aeron

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix unintended change

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Reduce sprint test log spam

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More test spam cleanup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Significantly speed up TSNE tests

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* W2V iterator test unit/integration split

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More NLP test speedups

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Avoid debug/verbose mode leaking between tests

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* test tweak

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Arbiter extends base DL4J test

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Arbiter test speedup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* nlp-uima test speedup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More test speedups

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix ND4J base test

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Few small ND4J test speed improvements

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* DL4J tests speedup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More tweaks

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Even more test speedups

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More tweaks

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Various test fixes

Signed-off-by: Alex Black <blacka101@gmail.com>

* More test fixes

Signed-off-by: Alex Black <blacka101@gmail.com>

* Add ability to specify number of threads for C++ ops in BaseDL4JTest and BaseND4JTest

Signed-off-by: Alex Black <blacka101@gmail.com>

* nd4j-aeron test profile fix for CUDA

Signed-off-by: Alex Black <blacka101@gmail.com>
2020-01-22 22:27:01 +11:00
Alex Black 29104083cc
Various fixes (#143)
* #8568 ArrayUtil optimization

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #6171 Keras ReLU and ELU support

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Keras softmax layer import

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8549 Webjars dependency management

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix for TF import names ':0' suffix issue / NPE

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* BiasAdd: fix default data format for TF import

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Update zoo test ignores

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8509 SameDiff Listener API - provide frame + iteration

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8520 ND4J Environment

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Deconv3d

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Deconv3d fixes + gradient check

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Conv3d fixes + deconv3d DType test

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix issue with deconv3d gradient check weight init

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8579 Fix BaseCudaDataBuffer constructor fix for UINT16

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* DataType.isNumerical() returns false for BOOL type

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8504 Reduce Spark log spam for tests

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Clean up DL4J gradient check test spam

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More Gradient check spam reduction

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* SameDiff test spam reduction

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fixes for FlatBuffers mapping

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* SameDiff log spam cleanup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Tests should extend BaseNd4jTest

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Remove debug line in c++ op

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* ND4J test spam cleanup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* DL4J test spam reduction

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More Dl4J and datavec test spam cleanup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix for bad conv3d test

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Additional test

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Embedding layers: don't inherit global default activation function

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Trigger CI

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Consolidate all BaseDL4JTest classes to single class used everywhere; make timeout configurable per class

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Test fixes and timeout increases

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Timeouts and PReLU fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Restore libnd4j build threads arg for CUDA build

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Increase timeouts on a few tests to avoid spurious failures on some CI machines

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More timeout fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More test timeout fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Tweak timeout for one more test

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Final tweaks

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* One more ignore

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2020-01-04 13:45:07 +11:00
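
One bullet in the commit above makes DataType.isNumerical() return false for the BOOL type. A short JUnit sketch of the resulting behaviour (this is not the PR's own test code; the method and enum names are taken from the bullet):

    import static org.junit.Assert.assertFalse;
    import static org.junit.Assert.assertTrue;

    import org.junit.Test;
    import org.nd4j.linalg.api.buffer.DataType;

    public class DataTypeNumericalSketch {

        @Test
        public void boolIsNotNumerical() {
            assertFalse(DataType.BOOL.isNumerical());   // behaviour described by the bullet above
            assertTrue(DataType.FLOAT.isNumerical());   // numeric types remain numerical
        }
    }
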
Alexander Stoyakin 010744ef9c Lu wrapper and tests fixes (#144)
* Tests fixed

* Lu added

* Test fixed

* Default timeout

* Tests timeouts fixed.

* TF import fix

* Timeouts added

* Timeout fixed.

* Test corrected

* rgb and yiq conversion ops added

* Converter ops added

* Header

* Yuv converters

* API added

* Empty test for matmul

* Explanation

* skip gemm/gemv on empty inputs

Signed-off-by: raver119 <raver119@gmail.com>

* Test added

* Correct test

* one more empty pass-through for mmul

Signed-off-by: raver119 <raver119@gmail.com>

* Cleanup

* Test added

* Test fixed

* Added missing mapping

* Added missing mapping

Co-authored-by: raver119 <raver119@gmail.com>
2019-12-30 15:06:12 +03:00
Alexander Stoyakin f5068f3980 Added missing Java ops wrappers (#122)
* Timeouts added

* Added some ops

* Ops added

* Fixed tests

* Minor fix

* Some fixes

* Digamma added

* Small fixes

* Timeouts added

* Added some ops

* Ops added

* Fixed tests

* Minor fix

* Some fixes

* Digamma added

* Small fixes

* Fused batch norm fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Tests switched off.

* Added test for resize_bicubic.

* Eliminated waste in the bicubic resize test.

* Switched off multithreading explicitly.

* HsvToRgb and RgbToHsv added

* Eliminated leftover comments and conformed to proper float constants.

Signed-off-by: shugeo <sgazeos@gmail.com>

* Fixed multithreading with resize_bicubic helper for cpu platform.

Signed-off-by: shugeo <sgazeos@gmail.com>

* ResizeBicubic was fixed.

* Some fixes

* Fix op name

* Validation fixed.

* Clarifications for tests

* Wrappers and small fixes for new ops.
2019-12-19 20:15:48 +11:00
Alex Black 9cc8803b8d
DL4J + Keras import: Causal Conv1D support (#107)
* Keras causal conv1d support first steps

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Add tests

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Causal conv mode

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Gradient check and fixes for causal conv1d

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix Conv1D import and testing

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Cleanup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small keras test fix

Signed-off-by: Alex Black <blacka101@gmail.com>

* Don't allow setting causal convolution mode to conv2d/3d layers

Signed-off-by: Alex Black <blacka101@gmail.com>

* More robustly infer nIn for recurrent layers for ambiguous NCW and NWC cases

Signed-off-by: Alex Black <blacka101@gmail.com>

* Polish and cleanup

Signed-off-by: Alex Black <blacka101@gmail.com>
2019-12-04 22:52:06 +11:00
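
The commit above adds causal padding for Conv1D and blocks it for 2D/3D layers. A rough sketch of what configuring a causal 1D convolution looks like with the DL4J layer builders (channel counts and the exact enum/builder names are assumptions based on the commit, not copied from it):

    import org.deeplearning4j.nn.conf.ConvolutionMode;
    import org.deeplearning4j.nn.conf.layers.Convolution1DLayer;

    public class CausalConv1DSketch {
        public static Convolution1DLayer causalConv() {
            // Causal mode pads only on the left of the time axis, so the output at time t
            // never depends on inputs after t - the property Keras' "causal" padding guarantees.
            return new Convolution1DLayer.Builder()
                    .nIn(8)                                   // assumed channel counts
                    .nOut(16)
                    .kernelSize(4)
                    .convolutionMode(ConvolutionMode.Causal)  // per the commit, only valid for 1D convolutions
                    .build();
        }
    }
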
Samuel Audet 5e07998e59 Add support for CUDA 10.2 (#89) 2019-11-29 16:31:03 +11:00
Alex Black 8843c7377a
Update shaded Jackson version to 2.10.1 (#82)
* Update shaded Jackson version to 2.10.1

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Remove no longer needed scala compiler plugin from UI

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix op name for BitwiseAnd op

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* TimeDistributedLayer mask array fix + test

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-11-26 19:24:38 +11:00
Alex Black 5b2ee72673
DL4J Time Distributed + fixes + Vertx module profiles fix (#78)
* Add test profiles to vertx module

* Arbiter test tweaks

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Add TimeDistributed wrapper layer

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Tests for TimeDistributed layer

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small test dependency exclusion for Spark module

* Fixes, more thorough tests

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-11-25 16:00:21 +11:00
Alex Black e910ce75ec
Various Fixes (#75)
* #8431 Cast loss function weights array automatically

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Add 'regex verbose mode' printing (ExecDebugListener) for TFGraphTestAllSameDiff

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Class import mapping fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Reshape fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Don't swallow first exception in NativeOpExecutioner.exec(CustomOp)

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-11-23 20:06:12 +11:00
Alex Black 4a2fedf3e7
DL4J: Add Sparse multi-class cross entropy loss function (#72)
* #8432 Add sparse mcxent loss

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fixes for LossSparseMCXENT

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* add simple debugging listener for SameDiff exec debugging

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Extra gradient check + header polishing

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-11-22 18:54:31 +11:00
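
The commit above adds LossSparseMCXENT, a multi-class cross entropy variant that (like TF's sparse softmax cross entropy) is fed integer class-index labels rather than one-hot labels. A hedged sketch of plugging it into an output layer (layer sizes and the import path are assumptions):

    import org.deeplearning4j.nn.conf.layers.OutputLayer;
    import org.nd4j.linalg.activations.Activation;
    import org.nd4j.linalg.lossfunctions.impl.LossSparseMCXENT;   // assumed package, alongside LossMCXENT

    public class SparseMcxentSketch {
        public static OutputLayer sparseSoftmaxOutput() {
            return new OutputLayer.Builder()
                    .nIn(128)                          // placeholder sizes
                    .nOut(10)
                    .activation(Activation.SOFTMAX)
                    // Labels arrive as class indices (one column) instead of one-hot vectors,
                    // which is the "sparse" part of the name (assumed semantics).
                    .lossFunction(new LossSparseMCXENT())
                    .build();
        }
    }
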
Alex Black 8f96f71f2b
Mist gradient check (#57)
Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-11-19 00:12:59 +11:00
Alex Black a856922fe9
#8409 Fix compgraph backprop issue with dual embedding layers from single input (#52)
Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-11-16 23:09:41 +11:00
Alex Black 09a827fb6d
Fixes and pre-release QA (#51)
* #8395 Keras import - support scaled identity weight init

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More Keras scaled weight init fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8352 Deprecate duplicate SamplingDataSetIterator class

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Remove /O2 optimization for faster CUDA build

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Tweak regression test precision for CUDA

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix edge cases for buffer creation

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Update MKLDNN validation tests to new helper enable/disable settings

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Delete debugging class

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* MKLDNN test - add proper skip for CUDA backend

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Align WeightInitUtil with weight init classes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix for SameDiff test layers weight init when using IWeightInit classes

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-11-16 17:04:29 +11:00
Alex Black 47d19908f4
Various fixes (#43)
* #8172 Enable DL4J MKLDNN batch norm backward pass

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8382 INDArray.toString() rank 1 brackets / ambiguity fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8308 Fix handful of broken links (inc. some in errors)

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Unused dependencies, round 1

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Unused dependencies, round 2

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Unused dependencies, round 3

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Uniform distribution TF import fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-11-14 19:38:20 +11:00
Alex Black 18c01f5bdc
Add SameDiff memory reuse memory manager (array cache) (#39)
* Attention op comments

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* ArrayCacheMemoryMgr - first pass

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Tweak array cache for use with SameDiff identity arrays

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* ArrayCacheMemoryMgr javadoc and properly get max memory

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* LRU cache policy + add tests

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Resize arrays internally if required for ArrayCacheMemoryMgr

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Test improvement

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small polish

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-11-12 21:15:44 +11:00
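
The memory manager above caches released arrays and evicts them with an LRU policy. The sketch below is not the ArrayCacheMemoryMgr code, only the general LRU idea expressed with a JDK LinkedHashMap in access order; per the bullets above, the real manager additionally caps total cached memory and resizes arrays when needed.

    import java.util.LinkedHashMap;
    import java.util.Map;

    /** Toy LRU cache: the least-recently-accessed entry is evicted once capacity is exceeded. */
    public class LruCacheSketch<K, V> extends LinkedHashMap<K, V> {
        private final int maxEntries;

        public LruCacheSketch(int maxEntries) {
            super(16, 0.75f, true);   // accessOrder=true: iteration order is least- to most-recently used
            this.maxEntries = maxEntries;
        }

        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            // Called after each put; returning true drops the least-recently-used entry.
            return size() > maxEntries;
        }
    }
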
Alex Black 9efd811508
Use DL4J workspaces for SameDiff layers in MLN/CG (#23)
* #8329 DL4J workspace integration for SameDiff layers

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix bug for Nd4j.createUninitializedDetached for scalars (length 0 shape array)

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* SameDiff output layer, graph vertex, various fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Javadoc

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-11-02 17:42:01 +11:00
Alex Black d82877b18b
Various SameDiff fixes (#21)
* MKLDNN LSTM forward implementation (disabled pending #8331)

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8318 add SameDiff.calculateGradientsAndOutputs

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Disable mkldnn backprop for now - pending fix, issue #8335

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8337 Fix CudaExecutioner unnecessary result array allocation/replacement

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small FlatBuffers serde fix, UInt8

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8135 ImagePreProcessingScaler - add segmentation support

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8319 Ensure listeners are called when they are supposed to be called

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8214 UNet (non-pretrained) last conv layer kernel size fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-11-02 11:25:53 +11:00
Alexander Stoyakin 45a40c8a89
DL4J/ND4J: Do pass on integer casts (#15)
* Int cast fixes.

* Revert "Int cast fixes."

This reverts commit aa36e8ca

* Int casts

* Int cast

* Int casts

* Get rid of int casts. Dropping deprecated aggregate ops.

* java scatterUpdate changes

Signed-off-by: raver119 <raver119@gmail.com>

* c++ scatterUpdate changes

Signed-off-by: raver119 <raver119@gmail.com>

* Remove aggregated ops.

* Restored test

* Tests restored.

* Minor fixes
2019-10-31 11:23:09 +02:00
Alex Black d333d29099
SameDiff cleanup and fixes (#12)
* #8160 Remove resolvePrepertiesFromSameDiffBeforeExecution

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* SameDiff API cleanup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More SameDiff cleanup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8248 Switch SameDiff variable init from lazy to creation time for more predictable behaviour

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8252 TanhDerivative javadoc

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8225 Deconvolution2D input validation

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8265 Switch SameDiff.outputs() to user settable, instead of unreliable 'best guess'

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8224 SameDiff.zero and .one create constants, not variables

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More cleanup and fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small test fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* DL4J SameDiff fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Re-add hack for Deconvolution2DLayer until #8315 is resolved

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8270 Move CUDA device/version logging to Java; can be disabled via existing org.nd4j.log.initialization system property

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* All ND4J init logging checks system property

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small tweak

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Remove redundant device logging

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* One more fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* UX improvements

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Deconv fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Add deconv tests

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Cleanup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Remove debug code

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-10-26 12:38:08 +11:00
Alexander Stoyakin d98784197a
[WIP] Some optimizations in dl4j (#11)
* Minor optimization.

* Reduce number of objects.

* Extend arrays when limit reached

* Test

* Some fixes.

* Small fix

* Wrong condition fixed.

* Fixes of reallocation.

* Small fix.

* Tests

* Clean up

* Test added.

* Tests and some fixes.

* Test

* Test fixed.

* Conflict fixed.

* UX improved
2019-10-24 14:15:50 +03:00
Alex Black 3f0b4a2d4c
SameDiff execution, TF and memory management overhaul (#10)
* SameDiff execution memory management improvements, round 1

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Round 2

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Round 3

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Clear node outputs closed array references; Slight change to OpValidation internals to not rely on cached op outputs

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Next steps

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Next step

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Cleanup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More polish

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Add WeakIdentityHashmap

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Session fixes for control ops and next steps

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* First steps for training session + in-line updating

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Next steps

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix losses and history during training

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* BiasAdd and other fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Don't use SDVariable.getArr() in TFGraphTestAllHelper (import tests)

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Cleanup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* First steps for new dependency tracking approach

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Start integrating dependency tracking for memory management

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Non-control op dependency tracking works/passes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Switch/merge

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Next steps

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Cleanup and next steps

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix issue dependency tracking for initial variables/constants

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Add check for aliases when determining if safe to close array

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* First pass on new TF graph import class

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Import fixes, op fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Cleanup and fixes for new TF import mapper

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More cleanup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Partial implementation of new dependency tracker

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Next steps

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* AbstractDependencyTracker for shared code

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Overhaul SameDiff graph execution (dependency tracking)

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More fixes, cleanup, next steps

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Add no-op memory manager, cleanup, fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix switch dependency tracking

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* INDArray.toString: no exception on closed arrays, just note closed

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix enter and exit dependency tracking

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* TensorArray memory management fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Add unique ID for INDArray instances

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix memory management for NextIteration outputs in multi-iteration loops

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Remove (now unnecessary) special case handling for nested enters

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Cleanup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Handle control dependencies during execution; javadoc for memory managers

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Cleanup, polish, code comments, javadoc

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Cleanup and more javadoc

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Add memory validation for all TF import tests - ensure all arrays (except outputs) are released

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Clean up arrays waiting on unexecuted ops at the end of execution

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fixes for enter op memory management in the context of multiple non-nested loops/frames

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix order of operation issues for dependency tracker

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Always clear op fields after execution to avoid leaks or unintended array reuse

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Re-implement dtype conversion

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix for control dependencies execution (dependency tracking)

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix TF import overrides and filtering

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix for constant enter array dependency tracking

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* DL4J Fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More DL4J fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Cleanup and polish

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More polish and javadoc

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More logging level tweaks, small DL4J fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small fix to DL4J SameDiffLayer

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix empty array deserialization, add extra deserialization checks

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* FlatBuffers control dep serialization fixes; test serialization as part of all TF import tests

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Variable control dependencies serialization fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix issue with removing inputs for ops

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* FlatBuffers NDArray deserialization fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* FlatBuffers NDArray deserialization fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Final cleanup/polish

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-10-23 21:19:50 +11:00
raver119 7abc574eeb
Snapshot update (#8194)
* fix double consumption of rng on cpu

Signed-off-by: raver119 <raver119@gmail.com>

* Shyrma docs (#222)

* - documenting and profiling matrix_set_diag cuda kernel

Signed-off-by: Yurii <yurii@skymind.io>

* - correct formula of pnorm pooling in cuda 2d/3d kernels
- remove helper matrix_diag which duplicates work of helper matrix_set_diag

Signed-off-by: Yurii <yurii@skymind.io>

* cublasHandle sharing + lock

Signed-off-by: raver119 <raver119@gmail.com>

* cublasHandle sharing + lock

Signed-off-by: raver119 <raver119@gmail.com>

* Documentation for serialization/deserialization in NLP (#221)

* refactoring

Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>

* Javadocs

Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>

* Javadoc fixed

Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>

* Cleanup

Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>

* dedicated lock for getCudaCublasHandle

Signed-off-by: raver119 <raver119@gmail.com>

* Small fixes (#223)

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* ELU DL4J fixes (#224)

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* javadoc (#225)

Signed-off-by: Robert Altena <Rob@Ra-ai.com>

* Small test compilation fix (#226)

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8182 remove spark version suffix (#227)

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* [WIP] Thread safety (#229)

* sync after cublas*gemm

Signed-off-by: raver119 <raver119@gmail.com>

* mutex for CublasHelper

Signed-off-by: raver119 <raver119@gmail.com>

* don't store cublasHandle in LaunchContext, it's per-device anyway

Signed-off-by: raver119 <raver119@gmail.com>

* some printout

Signed-off-by: raver119 <raver119@gmail.com>

* check for field instead

Signed-off-by: raver119 <raver119@gmail.com>

* pew-pew

Signed-off-by: raver119 <raver119@gmail.com>

* don't release ContextBuffers until device changed

Signed-off-by: raver119 <raver119@gmail.com>

* small tweak

Signed-off-by: raver119 <raver119@gmail.com>

* some logging in sgemm

Signed-off-by: raver119 <raver119@gmail.com>

* stream sync

Signed-off-by: raver119 <raver119@gmail.com>

* some more logging

Signed-off-by: raver119 <raver119@gmail.com>

* some more error checks

Signed-off-by: raver119 <raver119@gmail.com>

* one fancy test

Signed-off-by: raver119 <raver119@gmail.com>

* one fancy test

Signed-off-by: raver119 <raver119@gmail.com>

* minor AffinityManager fix

Signed-off-by: raver119 <raver119@gmail.com>

* cudaEvent error logging improvement

Signed-off-by: raver119 <raver119@gmail.com>

* ConstantHelper thread safety

Signed-off-by: raver119 <raver119@gmail.com>

* - minor corrections in ConstantTadHelper

Signed-off-by: Yurii <yurii@skymind.io>

* ConstantShapeHelper thread safety

Signed-off-by: raver119 <raver119@gmail.com>

* ConstantTadHelper.cu updated

Signed-off-by: raver119 <raver119@gmail.com>

* logging off

Signed-off-by: raver119 <raver119@gmail.com>

* logging off

Signed-off-by: raver119 <raver119@gmail.com>
2019-09-03 22:02:02 +03:00
Alexander Stoyakin f414239ed5
Types fixed (#205)
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
2019-08-30 16:18:41 +03:00
Alex Black 3f3b676ce5
DL4J Fixes (#204)
* Fix issue with recently introduced exception handling system in MultiLayerNetwork/ComputationGraph

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix for SpaceToBatch layer

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8133 DL4J SpaceToBatch gradient fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-08-30 23:00:53 +10:00
Alex Black dcc2baa676
Version upgrades (#199)
* DataVec fixes for Jackson version upgrade

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* DL4J jackson updates + databind version 2.9.9.3

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Shade snakeyaml along with jackson

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Version fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Switch DataVec legacy JSON format handling to mixins

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Next set of fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Cleanup for legacy JSON mapping

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Upgrade commons compress to 1.18; small test fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* New Jackson backward compatibility for DL4J - Round 1

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* New Jackson backward compatibility for DL4J - Round 2

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More fixes, all but legacy custom passing

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Provide an upgrade path for custom layers for models in pre-1.0.0-beta JSON format

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Legacy deserialization cleanup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small amount of polish - legacy JSON

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Upgrade guava version

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* IEvaluation legacy format deserialization fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Upgrade play version to 2.7.3

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Update nd4j-parameter-server-status to new Play API

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Update DL4J UI for new play version

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More play framework updates

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Remove Spark 1/2 adapter code from DataVec

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* datavec-spark dependency cleanup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* DL4J spark updates, pt 1

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* DL4J spark updates, pt 2

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* DL4J spark updates, pt 3

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* DL4J spark updates, pt 4

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Test fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Another fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Breeze upgrade, dependency cleanup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Add Scala 2.12 version to pom.xml

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* change-scala-versions.sh - add scala 2.12, remove 2.10

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Move Spark version properties to parent pom (now that only one spark version is supported)

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* DataVec Play fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* datavec play dependency fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Clean up old spark/jackson stuff

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Cleanup jackson unused dependencies

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Add shaded guava

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Dropping redundant dependency

Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>

* Removed scalaxy dependency

Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>

* Ensure not possible to import pre-shaded classes, and remove direct guava dependencies in favor of shaded

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* ND4J Shaded guava import fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* DataVec and DL4J guava shading

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Arbiter, RL4J fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Build fixed

Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>

* Fix dependency

Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>

* Fix bad merge

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Jackson shading fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Set play secret, datavec-spark-inference-server

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix for datavec-spark-inference-server

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Arbiter fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Arbiter fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small test fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-08-30 14:35:27 +10:00
Ryan Nett 70c916c55b
Beta4 Regression tests (#196)
* beta4 regression tests

Signed-off-by: Ryan Nett <rnett@skymind.io>

* fixes, dtype tests

Signed-off-by: Ryan Nett <rnett@skymind.io>

* test fixes

Signed-off-by: Ryan Nett <rnett@skymind.io>

* fix

Signed-off-by: Ryan Nett <rnett@skymind.io>
2019-08-29 17:31:57 -07:00
Serhii Shepel 0463ee4eba Fix backend dependencies for tests (#189) 2019-08-29 12:54:48 +09:00
Ryan Nett f40bdcf885 DL4J LSTM and Dropout CuDNN fallback and options (#152)
* add fallback for Conv layer activation

Signed-off-by: Ryan Nett <rnett@skymind.io>

* add fallback and config option for LSTM layers

Signed-off-by: Ryan Nett <rnett@skymind.io>

* add fallback option and setting for dropout

Signed-off-by: Ryan Nett <rnett@skymind.io>

* fix comments and error messages

Signed-off-by: Ryan Nett <rnett@skymind.io>

* move helper fail count to layer instance

Signed-off-by: Ryan Nett <rnett@skymind.io>

* ignore helperCountFail for equals and json

Signed-off-by: Ryan Nett <rnett@skymind.io>

* typo fix (MLK -> MKL)

Signed-off-by: Ryan Nett <rnett@skymind.io>

* add MKLDNN to error messages

Signed-off-by: Ryan Nett <rnett@skymind.io>

* add helperAllowFallback to builders, deprecate cudnnAllowFallback

Signed-off-by: Ryan Nett <rnett@skymind.io>
2019-08-29 13:05:01 +10:00
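
The commit above introduces a helperAllowFallback builder option (deprecating cudnnAllowFallback) so a layer can drop back to the built-in implementation if its cuDNN/MKL-DNN helper fails. A sketch of setting that flag on an LSTM layer (sizes and the exact builder signature are assumptions; the flag name comes from the commit):

    import org.deeplearning4j.nn.conf.layers.LSTM;

    public class HelperFallbackSketch {
        public static LSTM lstmWithFallback() {
            return new LSTM.Builder()
                    .nIn(32)                      // placeholder sizes
                    .nOut(64)
                    .helperAllowFallback(true)    // fall back to the built-in impl if the helper fails
                    .build();
        }
    }
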
Alex Black 8e3d569f18
Small fixes to subsampling layer (#158)
Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-08-23 22:50:07 +10:00
Alex Black 80d35377d4
SameDiff cleanup and fixes (#150)
* Cleanup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* SDVariable no longer extends DifferentialFunction

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8123 Remove cloning library to avoid 'illegal reflective access' warnings

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8095 Make Pooling3D abstract, fix flatbuffers serialization issue

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8117 WordVectorSerializer deprecated method javadoc

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Final fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* One more

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-08-23 15:09:53 +10:00
Alex Black 9c2bfc9863
Various fixes (DL4J, ND4J) (#147)
* Import fixes, IsMax dtype calc, small test fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* SubsamplingLayer fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* DL4J - SpaceToBatch layer updates

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-08-22 16:16:03 +10:00
Ryan Nett 11bddb3825 SameDiff: Listener changes and training api update (#99)
* example api

Signed-off-by: Ryan Nett <rnett@skymind.io>

* Lambda based evaluation

Signed-off-by: Ryan Nett <rnett@skymind.io>

* lambda test

Signed-off-by: Ryan Nett <rnett@skymind.io>

* fixes

Signed-off-by: Ryan Nett <rnett@skymind.io>

* partial fixes, use get-variable listener framework, example EvaluationListener

Signed-off-by: Ryan Nett <rnett@skymind.io>

* javadoc fix and newInstance implementations

Signed-off-by: Ryan Nett <rnett@skymind.io>

* fit and evaluate methods with validation data (for fit) and listeners

Signed-off-by: Ryan Nett <rnett@skymind.io>

* output method overloads + listener args

Signed-off-by: Ryan Nett <rnett@skymind.io>

* history and evaluation helpers

Signed-off-by: Ryan Nett <rnett@skymind.io>

* fixes

Signed-off-by: Ryan Nett <rnett@skymind.io>

* more fixes

Signed-off-by: Ryan Nett <rnett@skymind.io>

* FitConfig and added getters and setters

Signed-off-by: Ryan Nett <rnett@skymind.io>

* javadocs

Signed-off-by: Ryan Nett <rnett@skymind.io>

* fixes, javadoc, added activations to history, added latest activation listener

Signed-off-by: Ryan Nett <rnett@skymind.io>

* fixes, start of tests

Signed-off-by: Ryan Nett <rnett@skymind.io>

* fixes and updates

Signed-off-by: Ryan Nett <rnett@skymind.io>

* newInstance fixes, tests

Signed-off-by: Ryan Nett <rnett@skymind.io>

* test fixes

Signed-off-by: Ryan Nett <rnett@skymind.io>

* javadocs, getters with SDVariable overrides, CustomEvaluation fix

Signed-off-by: Ryan Nett <rnett@skymind.io>

* more operation config classes (evaluation, output, exec/single batch output), fix custom eval tests

Signed-off-by: Ryan Nett <rnett@skymind.io>

* merge fixes

Signed-off-by: Ryan Nett <rnett@skymind.io>

* fix

Signed-off-by: Ryan Nett <rnett@skymind.io>

* fixes, most old fit/evaluate/output methods use the builders

Signed-off-by: Ryan Nett <rnett@skymind.io>

* fixes

Signed-off-by: Ryan Nett <rnett@skymind.io>

* numerous fixes/cleanup

Signed-off-by: Ryan Nett <rnett@skymind.io>

* fixes

Signed-off-by: Ryan Nett <rnett@skymind.io>

* javadoc

Signed-off-by: Ryan Nett <rnett@skymind.io>

* Polish round 1

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Round 2

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Formatting + round 3

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Round 4

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-08-10 15:30:31 +10:00
Alex Black edb71bf46f
Add SameDiff exec debugging listener + few fixes (#104)
* First pass on SameDiff op exec debug listener

Signed-off-by: Alex Black <blacka101@gmail.com>

* #7555 DL4J helpers - don't fall back on builtin for op profiler exceptions

Signed-off-by: Alex Black <blacka101@gmail.com>

* Exec debugging listener + fixes

Signed-off-by: Alex Black <blacka101@gmail.com>

* Fix import counts for TF ops in OpValidationSuite

Signed-off-by: Alex Black <blacka101@gmail.com>

* Fix bad DL4J test configuration

Signed-off-by: Alex Black <blacka101@gmail.com>

* Exec debugging listener polish

Signed-off-by: Alex Black <blacka101@gmail.com>

* Small fix

Signed-off-by: Alex Black <blacka101@gmail.com>

* Another fix

Signed-off-by: Alex Black <blacka101@gmail.com>
2019-08-07 17:18:29 +10:00
Alex Black e18e2dc014 Various ND4J/DL4J fixes (#90)
* Deprecate Old*Op instances

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8063 #8054 Broadcast exceptions + cleanup inplace ops

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Remove bad test condition

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #7993 Fix shape function issue in crop_and_resize op

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* DL4J SameDiff lambda layer fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8029 Fix for pnorm backprop math

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-08-05 11:25:48 +10:00
Alex Black b95417f7c5 Various ND4J/DL4J fixes and improvements (#87)
* Reshape and reallocate - small fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Reshape and reallocate - small fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #6488 ElementWiseVertex broadcast support

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Constructors and broadcast supported it Transforms.max/min

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8054 ElementWiseVertex now supports broadcast inputs

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8057 Nd4j.create overload dtype fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #7551 ND4J Shape validation fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-08-05 11:24:20 +10:00
Alex Black fad8da878f Various DL4J/ND4J fixes (#81)
* #7954 Force refresh of UI when switching tabs on overview page

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8017 Concurrent modification exception (synchronize) fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8033 Don't initialize updater in middle of writing memory crash dump

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8208 Fix shape checks for ND4J int[] creator methods

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #6385 #7992 Keras import naming fixes + cleanup

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #8016 Upsampling3D - add NDHWC format support

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-08-05 11:21:23 +10:00
Alex Black c3e684d648 DL4J: Switch subsampling layer to custom ops; DL4J samediff mask fix (#67)
* SpaceToDepth layer fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Switch subsampling layer to use dynamiccustomop + add legacy mode support for beta4 and earlier models

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small subsampling fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Subsampling layer eps fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Handle 'no mask provided this minibatch' case for DL4J SameDiff layers

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small comment/javadoc fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-07-20 23:17:28 +10:00
Alex Black d94bc7257c Various fixes (#65)
* #7977 deprecate legacy MultiLayerNetwork/ComputationGraph.params(boolean) method

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix bad test

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix Histogram mapping

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix incorrect name handling in DifferentialFunction

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* More fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Histogram fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Proper histogram fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* ToString/NDArrayStrings fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* JSON UTF8 serialization fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-07-20 23:16:41 +10:00
Alex Black 1170827c18 Merge master to upstream (#7945)
* Shugeo strided slice zeros (#14)

* Modified strided_slice op to properly work with empty-like shapes.

* Fixed test for reduce_mean with empty-like input.

* [WIP] Last merge (#15)

* correct logsoftmax looss (#2)

* Small SameDiff listener fix (#4)

* Various fixes (#6)

* #7839 Fix for asXMatrix and tests

* #7866 EmbeddingSequenceLayer dtype fix + test

* #7856 SameDiff save/load stream methods

* #7859 RegressionEvaluation rank 4 fix + tests + axis configuration

* EvaluationBinary 3d/4d

* More evaluation 3d/4d tests

* #7847 Evaluation empty checks

* Small test fix

* #7848 Fix median edge case

* Improve DL4J samediff layer tests

* [WIP] FastText wrapper implemented (#8)

* FastText implemented

* Some fixes

* Fix shapes for wordsNearest

* Validation of input vectors

* Fixes

* Fixed test

* Thread tagged

* Some tweaks

* setContextClassLoader for DeallocatorServiceThread

* Numpy format tests (#1)

* Various fixes (#11)

* #7852 SameDiff gather fix

* #7892 SameDiff placeholder to constant conversion

* #7890 validate input rank for MLN/CG init methods

* Fix broken permute shape calculation

* Permute and gather fixes

* Tests

* #7850 LogSumExp fix + test

* Handful of test fixes

* Empty arrays with non-scalar shapes (#10)

* minor rearrangements for lambdas

* empty tensors with non-scalar shapes

* numpy empty tensors with non-scalar shapes

* few more empty tweaks

* Small fixes

* conv3d signature update

* micro fix in batchnorm mkldnn

* Import fixes

* Fix

* MKL-DNN update

* Small fill fix

* fill with empty input + test

* Fixes

* Small error improvement

* Fix

* one special test

* couple of fixes for lstm

* Rewrite TFGraphMapper.getNDArrayFromTensor to be maintainable and less error prone

* Fixes

* FP16

* Unsigned

* BFloat16

* Fill op - empty tweaks

* - couple of fixes for empty arrays construction
- stack updated

* strided slice fix

* one transform test

* provide method for reducing shapeInfo when the input array is empty

* Fixed reduceAlongDimensions to use empty input properly.

* couple of broadcast tests

* couple of broadcast tests + tweak to make them pass

* add non-empty check to methods producing sub-arrays

* Fixed reshapeC with zeros in shape.

* complete empty check in reduce_... legacy ops

* Concat and cumsum/prod

* Tweak to empty shape inference on import

* add empty check to the rest of reduce legacy ops

* one more test

* correct typo in evalReduceShapeInfoEmpty

* Added tests for reduce_* ops to tests with zero shapes.

* few more tests for empty reductions

* Fixed strided_slice op with empty case and tests.

* one more empty reduction test

* Fixed strided_slice test.

* add empty check to NDArray::reshapei

* infOrMax

* empty min/max with infinity tests

* made unstack work correctly with empty arrays

* few IndexReduce tests + tweaks for empty shapes

* add test for empty concat

* few tests fixed

* Validation fix for reductions on empty shapes

* Reverse fix

* Reduction shape calc fixes

* SameDiff.generateOutputVariable: don't use shape function to determine number of outputs

* Range fix

* - NDArray constructor updated for scalars/empty arrays
- few tests fixed

* More fixes

* Empty creator fixes

* concat fix

* concat fix

* TF import tests: allow 'both all NaN' and 'both all inf' to pass

* Slice, zero fraction, and reshape fixes

* transpose, gather

* Zero fraction

* scalar cast fix

* Empty reduction axis support

* few more tests fixed

* Fixed input checks conforming with TF for concat op and tests.

* few tests fixed

* matmul scalar shape fix

* Fixed check for data type and scalarity with concat to allow non-empty scalars with vector concats.

* broadcast bool fix

* few more tests

* few more tests

* correct evalReduceShapeInfoEmpty

* argmax/argmin + tests

* one more empty edge case + one more test

* argmax/argmin/realdiv_bp tweaks

* empty reshape test + fix

* Helper fixes

* Small fixes

* Gather test fix

* Gather test fix

* Small fixes

* reduce scalar zero values

* scalar mean workaround

* Remove debug code

* along dim mean workaround

* one more test

* - equalsTo() tweak for empty arrays
- one more test

* broadcast tweaks

* [WIP] Fixing outstanding issues for NLP (#9)

* Avoid using uninitialized objects

* Test fixed.

* Redundant method avoided for models like FastText

* KMeans++ implementation

* KMeans++ implementation

* Disable parallel execution

* KMeans++

* Tests

* Dev branch merge (#16)

* SameDiff: convertDataType and gradient check util improvements (#12)

* GradCheck util improvements

* StopGradient constructor + test

* SameDiff: Add datatype conversion

* Javadoc and add DataType.isNumerical()

* Small fix

* Fix SameDiff TF import test cases intermediate naming (workaround for bad default)

* TFGraphTestAllHelper: check intermediates in execution order

* Add missing debug listener

* [WIP] lstmBlock fix + other changes (#13)

- fixes lstmBlock issue
- changes NDArray method reshape(), permute(), transpose() by making them return instance instead of pointer
- CheckNumerics op
- fixes for ReduceBool IsInfOrNan & IsFinite

* Small test fix

* CheckNumerics op wrapper

* Fix some issues on master (#17)

* Fix DataVec test issue

* Fix issue with dl4j SameDiff output layer

* Dtype fix for lambda layers

* #7912 BertIterator dtype fix (use float32 not global default)

* [WIP] Next set of CUDA stuff (#7)

New CUDA implementations and improvements

* bad file

* Dev branch master merge (#23)

* SameDiff: convertDataType and gradient check util improvements (#12)

* GradCheck util improvements

* StopGradient constructor + test

* SameDiff: Add datatype conversion

* Javadoc and add DataType.isNumerical()

* Small fix

* Fix SameDiff TF import test cases intermediate naming (workaround for bad default)

* TFGraphTestAllHelper: check intermediates in execution order

* Add missing debug listener

* [WIP] lstmBlock fix + other changes (#13)

- fixes lstmBlock issue
- changes NDArray methods reshape(), permute(), transpose() to return an instance instead of a pointer
- CheckNumerics op
- fixes for ReduceBool IsInfOrNan & IsFinite

* Small test fix

* CheckNumerics op wrapper

* Compatibility of deserialization (#18)

Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>

* SameDiff: add activation gradient checking support for debugging (#19)

* SameDiff gradient checker: first pass on activation gradient checks

* Fixes + tests for activation gradient checking

* Javadoc

* [WIP] Some nd4j data type corrections (#20)

* Adjust data type

* Set correct Data type.

* Size of proper data type.

* fix averaged cpu load (#22)

* SameDiff ops, TF import and fixes (#24)

* CheckNumerics tests + fixes + misc fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fake quant

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fixes

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* FakeQuantWithMinMaxArgs

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* CheckNumerics fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix libnd4j ALL_INTS and ALL_FLOATS declaration (uint and bfloat types)

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Small fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Javadoc

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Exception tweak

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix for out of scope stack allocated var use

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Ignores

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Ignore for known failing test (already logged issue)

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Merge upstream to fork (#25)

* Add thousand-separator commas to TotalParams (#7915)

* Add thousand-separator commas to TotalParams

The number of parameters can be quite large, so the summary printout is easier to read when the TotalParams column and the totals at the bottom use thousand-separator commas (a short formatting sketch follows this entry).

* Add thousand-separator commas to MultiLayerNetwork

Corresponding change to MultiLayerNetwork

Signed-off-by: Jxtps Jxtps <jxtps435@gmail.com>
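
A minimal sketch of the formatting described above, using only standard Java format strings (the ',' flag adds locale-aware grouping separators); this is illustrative, not the actual summary-printing code, and the class name is hypothetical:

    // Illustrative only: thousand-separator formatting via the ',' format flag.
    public class ParamCountFormatting {
        public static void main(String[] args) {
            long totalParams = 62_378_344L;                        // example: a large parameter count
            System.out.println(String.format("%d", totalParams));  // 62378344   - hard to scan
            System.out.println(String.format("%,d", totalParams)); // 62,378,344 - grouped (separator is locale-dependent)
        }
    }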

* Update contributing and issue/PR templates (#7934)

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Fix link to AdaDelta paper (#7942)

Fix link to AdaDelta paper hosted on matthewzeiler.com

Signed-off-by: Jxtps

* Fixes, and ignores for known/logged failing issues (#7943)

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* SameDiff + DL4J/SameDiff: Multiple fixes (#28)

* #7919 HDF5 attribute buffer length fix

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #7909 Arbiter constructor exception UX improvements

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #7925 RNN output layer length checks

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #7939 Add listener for validating inputs are not incorrectly modified

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* #7939 Integrate NonInplaceValidationListener into tests

* #7844 DL4J SameDiff fixes for variable minibatch size

* DL4J SameDiff fixes - ensure gradient for input placeholder is available

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* Tweaks to ExternalErrorsFunction - use placeholders, make more robust

* Another fix

* More fixes

* More SameDiff/DL4J fixes

* Scope out scalar array creation in BaseScalarOp

* Remove debug code

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* [WIP] Final dev branch merge (#29)

* SameDiff: convertDataType and gradient check util improvements (#12)

* GradCheck util improvements

* StopGradient constructor + test

* SameDiff: Add datatype conversion

* Javadoc and add DataType.isNumerical()

* Small fix

* Fix SameDiff TF import test cases intermediate naming (workaround for bad default)

* TFGraphTestAllHelper: check intermediates in execution order

* Add missing debug listener

* [WIP] lstmBlock fix + other changes (#13)

- fixes lstmBlock issue
- changes NDArray methods reshape(), permute(), transpose() to return an instance instead of a pointer
- CheckNumerics op
- fixes for ReduceBool IsInfOrNan & IsFinite

* Small test fix

* CheckNumerics op wrapper

* Compatibility of deserialization (#18)

Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>

* SameDiff: add activation gradient checking support for debugging (#19)

* SameDiff gradient checker: first pass on activation gradient checks

* Fixes + tests for activation gradient checking

* Javadoc

* [WIP] Some nd4j data type corrections (#20)

* Adjust data type

* Set correct Data type.

* Size of proper data type.

* fix averaged cpu load (#22)

* [WIP] Multiple dataset iterators (#27)

* Splitting dataset into an arbitrary number of parts

* Fixes

* Multiple split of iterator

* Test

* Test

* Some fixes

* signature change

* one more tweak

Signed-off-by: raver119 <raver119@gmail.com>

* one more test for sequential use of DataSetIteratorSplitter

Signed-off-by: raver119 <raver119@gmail.com>

* Fixes

* Fixes

* one more test for Alexander

Signed-off-by: raver119 <raver119@gmail.com>

* Some fixes

* Some fixes

* one more test for Alexander

Signed-off-by: raver119 <raver119@gmail.com>

* minor test fix

Signed-off-by: raver119 <raver119@gmail.com>

* Some fixes

* Some fixes

* couple of assertions tweaked

Signed-off-by: raver119 <raver119@gmail.com>

* MDS splitter test :/

Signed-off-by: raver119 <raver119@gmail.com>

* Minor refactoring

* Multi dataset

* Some fixes

* More tests

* Small number of test fixes/improvements (failures on CI) (#31)

Signed-off-by: AlexDBlack <blacka101@gmail.com>

* [WIP] More CUDA stuff (#26)

* initial commit

Signed-off-by: raver119 <raver119@gmail.com>

* LRN BP CUDA

Signed-off-by: raver119 <raver119@gmail.com>

* less memory

Signed-off-by: raver119 <raver119@gmail.com>

* Fixed bug with crop_and_resize op helper.

* get rid of unnecessary index-calculation function

Signed-off-by: Yurii <yurii@skymind.io>

* Fixed sort with nth_element cuda-based helper.

* Refactored nth_element.

* Refactored nth_element op and tests.

* Modified usage of dim array with sortTad routine.

* Refactored main routine of helper for non_max_image_suppression op.

* non_max_image_suppression op helper with cuda kernel implementation. Initial revision.

* fix vol2col cuda kernel

* meh

Signed-off-by: raver119 <raver119@gmail.com>

* topK concept

Signed-off-by: raver119 <raver119@gmail.com>

* unsorted topK with scanWidth of 1

Signed-off-by: raver119 <raver119@gmail.com>

* correct vol2col tests

* sorted/unsorted topK

Signed-off-by: raver119 <raver119@gmail.com>

* implementation and fixes for col2im/col2vol

* Corrected input/output usage flags for reverse op.

* dup is const now

Signed-off-by: raver119 <raver119@gmail.com>

* percentile op

Signed-off-by: raver119 <raver119@gmail.com>

* group tests for maxpool2d

Signed-off-by: Yurii <yurii@skymind.io>

* special test for george

Signed-off-by: raver119 <raver119@gmail.com>

* less threads for sortTad

Signed-off-by: raver119 <raver119@gmail.com>

* provide conv2d for cuda

Signed-off-by: Yurii <yurii@skymind.io>

* remove author in sort tad kernel code

Signed-off-by: Yurii <yurii@skymind.io>

* provide depthwise_conv2d for cuda

Signed-off-by: Yurii <yurii@skymind.io>

* - max_pooling_with_argmax
- null check for special use

Signed-off-by: raver119 <raver119@gmail.com>

* dts cuda

Signed-off-by: raver119 <raver119@gmail.com>

* provide sconv2d for cuda

Signed-off-by: Yurii <yurii@skymind.io>

* std cuda

Signed-off-by: raver119 <raver119@gmail.com>

* Refactored non_max_suppression op to conform to the TF implementation.

* Improved suppression helper.

* provide pooling3d for cuda

Signed-off-by: Yurii <yurii@skymind.io>

* minor lstm rearrangements

Signed-off-by: raver119 <raver119@gmail.com>

* more of minor lstm rearrangements

Signed-off-by: raver119 <raver119@gmail.com>

* (bi)dynamic_rnn

Signed-off-by: raver119 <raver119@gmail.com>

* templates init order

Signed-off-by: raver119 <raver119@gmail.com>

* Refactored non_max_suppression op.

* Added cuda kernel for non_max_suppression.

* CPU sort by key/value

Signed-off-by: raver119 <raver119@gmail.com>

* CPU sort TAD by key/value

Signed-off-by: raver119 <raver119@gmail.com>

* CPU sort TAD by key/value tests

Signed-off-by: raver119 <raver119@gmail.com>

* Eliminate compiler error with cuda implementation.

* - repaired gradCheck in cuda
- provide conv2d_bp for cuda

Signed-off-by: Yurii <yurii@skymind.io>

* missed signature

Signed-off-by: raver119 <raver119@gmail.com>

* provide depthwise_conv2d_bp for cuda

Signed-off-by: Yurii <yurii@skymind.io>

* Implementation of lup helper with cuda kernel. Initial commit.

* further work on backprops for convolutions

Signed-off-by: Yurii <yurii@skymind.io>

* CUDA linear sort by key/val

Signed-off-by: raver119 <raver119@gmail.com>

* CUDA tad sort by key/val

Signed-off-by: raver119 <raver119@gmail.com>

* start providing backprop for pooling2d/3d

Signed-off-by: Yurii <yurii@skymind.io>

* Added atomicAdd for bool datatype.

* dynamic partition concept

Signed-off-by: raver119 <raver119@gmail.com>

* dynamic partition concept

Signed-off-by: raver119 <raver119@gmail.com>

* dynamic partition scalar CUDA

Signed-off-by: raver119 <raver119@gmail.com>

* important comment

Signed-off-by: raver119 <raver119@gmail.com>

* fix pooling2d/3d backprop helpers

Signed-off-by: Yurii <yurii@skymind.io>

* Added non-linear test with dynamic_partition.

* Improved test for dynamic_partition.

* dynamic_partition TAD concept

Signed-off-by: raver119 <raver119@gmail.com>

* - dynamic_partition TAD CUDA impl
- dynamic_partition TAD CPU fix

Signed-off-by: raver119 <raver119@gmail.com>

* - rewrite cpu code for upsampling2d/3d
- write cuda code for upsampling2d/3d

Signed-off-by: Yurii <yurii@skymind.io>

* dynamic_stitch CUDA vector case

Signed-off-by: raver119 <raver119@gmail.com>

* dynamic_stitch CUDA TAD case concept

Signed-off-by: raver119 <raver119@gmail.com>

* dynamic_stitch CUDA TAD case impl

Signed-off-by: raver119 <raver119@gmail.com>

* Added tests for dynamic_stitch 3D-4D cases.

* minor tests tweaks

Signed-off-by: raver119 <raver119@gmail.com>

* Fixed type check for dynamic stitch.

* min/max bp

Signed-off-by: raver119 <raver119@gmail.com>

* rewrite code for upsampling2d/3d cpu

Signed-off-by: Yurii <yurii@skymind.io>

* reduce min/max/norm_max bp

Signed-off-by: raver119 <raver119@gmail.com>

* lup implementation. Additional enhancements.

* provide code for upsampling2d/3d backprop

Signed-off-by: Yurii <yurii@skymind.io>

* weightedCrossEntropyWithLogits

Signed-off-by: raver119 <raver119@gmail.com>

* Fixed template math atomicMul for 64bit ints.

* Refactored dynamic_partition_bp op.

* inverseBroadcast fix

Signed-off-by: raver119 <raver119@gmail.com>

* DynamicPartitionBP test datatype fixed.

* - nd4j_atomicMul Windows fix
- cpu/NDArrayLambda.hpp excluded from CUDA

Signed-off-by: raver119 <raver119@gmail.com>
2019-06-27 18:37:04 +03:00
Alex Black cae4fc9760
Fixes, and ignores for known/logged failing issues (#7943)
Signed-off-by: AlexDBlack <blacka101@gmail.com>
2019-06-25 12:07:28 +10:00
Alex Black 68ea5f3688
Dev branch merge: dev_20190606 (#7904)
* correct logsoftmax loss (#2)

* Small SameDiff listener fix (#4)

* Various fixes (#6)

* #7839 Fix for asXMatrix and tests

* #7866 EmbeddingSequenceLayer dtype fix + test

* #7856 SameDiff save/load stream methods

* #7859 RegressionEvaluation rank 4 fix + tests + axis configuration

* EvaluationBinary 3d/4d

* More evaluation 3d/4d tests

* #7847 Evaluation empty checks

* Small test fix

* #7848 Fix median edge case

* Improve DL4J samediff layer tests

* [WIP] FastText wrapper implemented (#8)

* FastText implemented

* Some fixes

* Fix shapes for wordsNearest

* Validation of input vectors

* Fixes

* Fixed test

* Thread tagged

* Some tweaks

* setContextClassLoader for DeallocatorServiceThread

* Numpy format tests (#1)

* Various fixes (#11)

* #7852 SameDiff gather fix

* #7892 SameDiff placeholder to constant conversion

* #7890 validate input rank for MLN/CG init methods

* Fix broken permute shape calculation

* Permute and gather fixes

* Tests

* #7850 LogSumExp fix + test

* Handful of test fixes

* Empty arrays with non-scalar shapes (#10)

* minor rearrangements for lambdas

* empty tensors with non-scalar shapes

* numpy empty tensors with non-scalar shapes

* few more empty tweaks

* Small fixes

* conv3d signature update

* micro fix in batchnorm mkldnn

* Import fixes

* Fix

* MKL-DNN update

* Small fill fix

* fill with empty input + test

* Fixes

* Small error improvement

* Fix

* one special test

* couple of fixes for lstm

* Rewrite TFGraphMapper.getNDArrayFromTensor to be maintainable and less error prone

* Fixes

* FP16

* Unsigned

* BFloat16

* Fill op - empty tweaks

* - couple of fixes for empty arrays construction
- stack updated

* strided slice fix

* one transform test

* provide method for reducing shapeInfo when the input array is empty

* Fixed reduceAlongDimensions to use empty input properly.

* couple of broadcast tests

* couple more broadcast tests + tweak to make them pass

* add non-empty check to methods producing sub-arrays

* Fixed reshapeC with zeros in shape.

* complete empty check in reduce_... legacy ops

* Concat and cumsum/prod

* Tweak to empty shape inference on import

* add empty check to the rest of reduce legacy ops

* one more test

* correct typo in evalReduceShapeInfoEmpty

* Added tests for reduce_* ops to tests with zero shapes.

* few more tests for empty reductions

* Fixed strided_slice op with empty case and tests.

* one more empty reduction test

* Fixed strided_slice test.

* add empty check to NDArray::reshapei

* infOrMax

* empty min/max with infinity tests

* made unstack work correctly with empty arrays

* few IndexReduce tests + tweaks for empty shapes

* add test for empty concat

* few tests fixed

* Validation fix for reductions on empty shapes

* Reverse fix

* Reduction shape calc fixes

* SameDiff.generateOutputVariable: don't use shape function to determine number of outputs

* Range fix

* - NDArray constructor updated for scalars/empty arrays
- few tests fixed

* More fixes

* Empty creator fixes

* concat fix

* concat fix

* TF import tests: allow 'both all NaN' and 'both all inf' to pass

* Slice, zero fraction, and reshape fixes

* transpose, gather

* Zero fraction

* scalar cast fix

* Empty reduction axis support

* few more tests fixed

* Fixed concat op input checks to conform with TF, and added tests.

* few tests fixed

* matmul scalar shape fix

* Fixed check for data type and scalarity with concat to allow non-empty scalars with vector concats.

* broadcast bool fix

* few more tests

* few more tests

* correct evalReduceShapeInfoEmpty

* argmax/argmin + tests

* one more empty edge case + one more test

* argmax/argmin/realdiv_bp tweaks

* empty reshape test + fix

* Helper fixes

* Small fixes

* Gather test fix

* Gather test fix

* Small fixes

* reduce scalar zero values

* scalar mean workaround

* Remove debug code

* along dim mean workaround

* one more test

* - equalsTo() tweak for empty arrays
- one more test

* broadcast tweaks
2019-06-15 21:34:34 +10:00
Alex Black 32e5cc1945
First round of runtime test improvements (#7875)
* Capsnet test runtime improvements

* Slow test speedups

* Next round of test speed improvements

* More test improvements

* Improve test speed

* Next round of test speedups

* Another round

* More test speedups

* Another round

* Another round of test speedups

* Another round of speedups...

* CuDNN test speedups + more tests extending BaseDL4JTest

* Minor fix + more BaseDL4JTest use in other modules
2019-06-13 20:40:40 +10:00
skymindops b5f0ec072f Eclipse Migration Initial Commit 2019-06-06 15:21:15 +03:00