* - add causal mode of padding to convolutions
Signed-off-by: Yurii <iuriish@yahoo.com>
* - add additional tests for causal conv1d
Signed-off-by: Yurii <iuriish@yahoo.com>
* - add causal mode for cuda conv kernels
Signed-off-by: Yurii <iuriish@yahoo.com>
* Java side of Conv1D changes
Signed-off-by: raver119 <raver119@gmail.com>
* Add Conv1DDerivative op
Signed-off-by: Alex Black <blacka101@gmail.com>
* Causal Conv1D gradient checks
Signed-off-by: Alex Black <blacka101@gmail.com>
* Tweaks
Signed-off-by: Alex Black <blacka101@gmail.com>
* - add causal padding mode to conv2d_bp
Signed-off-by: Yurii <iuriish@yahoo.com>
* More thorough causal conv1d tests
Signed-off-by: Alex Black <blacka101@gmail.com>
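For context, causal padding pads the time dimension on the left only, so output step t never sees inputs later than t. A minimal standalone sketch of the idea in plain Java (illustrative only, not the libnd4j kernel; names and the "same"-length output convention are assumptions):

```java
// Illustrative sketch of causal padding for a 1D convolution (not the libnd4j kernel).
// Left-pad by (kernelSize - 1) * dilation so output[t] depends only on input[<= t].
public class CausalConv1DSketch {
    static double[] causalConv1d(double[] input, double[] kernel, int dilation) {
        int pad = (kernel.length - 1) * dilation;         // all padding goes to the left
        double[] padded = new double[input.length + pad];
        System.arraycopy(input, 0, padded, pad, input.length);

        double[] out = new double[input.length];          // "same" length output
        for (int t = 0; t < input.length; t++) {
            double sum = 0.0;
            for (int k = 0; k < kernel.length; k++) {
                sum += kernel[k] * padded[t + k * dilation];
            }
            out[t] = sum;
        }
        return out;
    }

    public static void main(String[] args) {
        double[] x = {1, 2, 3, 4, 5};
        double[] w = {0.5, 0.5};                          // simple moving-average kernel
        System.out.println(java.util.Arrays.toString(causalConv1d(x, w, 1)));
        // [0.5, 1.5, 2.5, 3.5, 4.5] -- each output uses only current and past inputs
    }
}
```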
* Added initial implementation of non_max_suppression_v3.
* Added check for exceeding the threshold.
* Added definition for V3 method.
* java remapping for NonMaxSuppressionV3
Signed-off-by: raver119 <raver119@gmail.com>
* Fixed processing of an empty output, plus the corresponding test.
* Refactored op to reduce threshold data to float.
* Implemented cuda-based helper for non_max_suppression_v3 op.
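As background, non_max_suppression_v3 adds a score_threshold on top of the usual greedy IoU suppression. A rough standalone sketch of that selection loop in plain Java (the [y1, x1, y2, x2] box layout and parameter names are assumptions for illustration, not the op's actual signature):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.stream.IntStream;

// Conceptual sketch of greedy non-max suppression with IoU and score thresholds.
public class NmsSketch {
    static double iou(double[] a, double[] b) {            // boxes as [y1, x1, y2, x2]
        double yy1 = Math.max(a[0], b[0]), xx1 = Math.max(a[1], b[1]);
        double yy2 = Math.min(a[2], b[2]), xx2 = Math.min(a[3], b[3]);
        double inter = Math.max(0, yy2 - yy1) * Math.max(0, xx2 - xx1);
        double areaA = (a[2] - a[0]) * (a[3] - a[1]);
        double areaB = (b[2] - b[0]) * (b[3] - b[1]);
        return inter <= 0 ? 0 : inter / (areaA + areaB - inter);
    }

    static List<Integer> nms(double[][] boxes, double[] scores, int maxOut,
                             double iouThreshold, double scoreThreshold) {
        List<Integer> candidates = new ArrayList<>();
        IntStream.range(0, boxes.length)
                 .filter(i -> scores[i] > scoreThreshold)   // v3: drop low-score boxes up front
                 .boxed()
                 .sorted(Comparator.comparingDouble((Integer i) -> scores[i]).reversed())
                 .forEach(candidates::add);

        List<Integer> selected = new ArrayList<>();
        for (int i : candidates) {
            if (selected.size() >= maxOut) break;
            boolean keep = true;
            for (int j : selected) {
                if (iou(boxes[i], boxes[j]) > iouThreshold) { keep = false; break; }
            }
            if (keep) selected.add(i);
        }
        return selected;                                    // may be empty, e.g. maxOut == 0
    }
}
```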
* Fixed fake_quant_with_min_max_vars op.
* Fixed tests with float numbers.
* - assert now stops execution
- sortByKey/sortByValue now have input validation
Signed-off-by: raver119 <raver119@gmail.com>
* missing var
Signed-off-by: raver119 <raver119@gmail.com>
* Fixed processing of zero max_size inputs.
* Refactored kernel callers.
* Fixed return statement for logdet op helper.
* Refactored unsorted segment SqrtN op.
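For reference, unsorted_segment_sqrt_n sums the values in each segment and divides by the square root of the segment's element count. A tiny plain-Java sketch of that semantics (not the actual helper):

```java
// Sketch: output[s] = sum(data[i] where segmentIds[i] == s) / sqrt(count of i in segment s)
static double[] unsortedSegmentSqrtN(double[] data, int[] segmentIds, int numSegments) {
    double[] sums = new double[numSegments];
    int[] counts = new int[numSegments];
    for (int i = 0; i < data.length; i++) {
        sums[segmentIds[i]] += data[i];
        counts[segmentIds[i]]++;
    }
    double[] out = new double[numSegments];
    for (int s = 0; s < numSegments; s++) {
        out[s] = counts[s] == 0 ? 0.0 : sums[s] / Math.sqrt(counts[s]);
    }
    return out;
}
```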
* get back 8 tail bytes on CUDA
Signed-off-by: raver119 <raver119@gmail.com>
* Refactored segment prod ops and helpers for cuda and tests.
* Additional test.
* CudaWorkspace tests updated for 8 tail bytes
Signed-off-by: raver119 <raver119@gmail.com>
* special atomic test
Signed-off-by: raver119 <raver119@gmail.com>
* atomicMul/atomicDiv fix for 16bit values
Signed-off-by: raver119 <raver119@gmail.com>
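The usual way to build 16-bit atomics is a compare-and-swap loop on the 32-bit word containing the half-word lane. A plain-Java illustration of that CAS pattern (conceptual only; the real fix is in the CUDA helpers, and the lane here is treated as a plain 16-bit integer rather than FP16):

```java
import java.util.concurrent.atomic.AtomicInteger;

// Conceptual CAS loop mirroring how a 16-bit atomicMul can be built on a 32-bit
// compare-and-swap: read the containing word, update one 16-bit lane, retry if
// another thread changed the word in the meantime.
public class Atomic16Sketch {
    static void atomicMul16(AtomicInteger word, boolean highLane, int factor) {
        while (true) {
            int old = word.get();
            int lane = highLane ? (old >>> 16) & 0xFFFF : old & 0xFFFF;
            int updated = (lane * factor) & 0xFFFF;         // 16-bit wraparound, for illustration
            int next = highLane ? (old & 0x0000FFFF) | (updated << 16)
                                : (old & 0xFFFF0000) | updated;
            if (word.compareAndSet(old, next)) return;      // success: no concurrent change
        }
    }
}
```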
* Removed leftover debug prints.
* Update shaded Jackson version to 2.10.1
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Remove no longer needed scala compiler plugin from UI
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Fix op name for BitwiseAnd op
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* TimeDistributedLayer mask array fix + test
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Corrected input checking and tests for bitcast op.
* Fixed an issue with non_max_suppression shape generation and processing when a score threshold is given.
* Fixed bilinear resize kernel and tests.
* push for Serhii
Signed-off-by: raver119 <raver119@gmail.com>
* Added test for nearest_neighbor resize with int input.
* Added data type check for input/output match.
* Eliminated an error in macros.
* Improved output message for type checking.
* Fixed input/output types for op.
* Removed leftover debug logging.
* Refactored resize_bilinear helper for multithreading for cpu platform.
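As a reminder of what the helper computes, bilinear resize maps each output pixel back to a fractional source coordinate and blends the four surrounding pixels. A single-channel plain-Java sketch (illustrative only; it ignores the align_corners/half_pixel variants the real op supports):

```java
// Bilinear resize of a single-channel image; simple out -> in scale mapping,
// no align_corners/half_pixel handling (kept minimal on purpose).
static double[][] resizeBilinear(double[][] src, int outH, int outW) {
    int inH = src.length, inW = src[0].length;
    double scaleY = (double) inH / outH, scaleX = (double) inW / outW;
    double[][] dst = new double[outH][outW];
    for (int y = 0; y < outH; y++) {
        double fy = y * scaleY;
        int y0 = (int) Math.floor(fy), y1 = Math.min(y0 + 1, inH - 1);
        double wy = fy - y0;
        for (int x = 0; x < outW; x++) {
            double fx = x * scaleX;
            int x0 = (int) Math.floor(fx), x1 = Math.min(x0 + 1, inW - 1);
            double wx = fx - x0;
            double top = src[y0][x0] * (1 - wx) + src[y0][x1] * wx;
            double bot = src[y1][x0] * (1 - wx) + src[y1][x1] * wx;
            dst[y][x] = top * (1 - wy) + bot * wy;
        }
    }
    return dst;
}
```

The outer row loop is independent per row, which is the part a CPU helper would typically split across threads.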
* Cosmetic changes only.
* Fixed error in string substitution.
* Skip test for cbow_batch with cuda.
* fix for resizeNearestNeighbor output dtype
Signed-off-by: raver119 <raver119@gmail.com>
* Refactored non_max_suppression helper.
* Refactored shape generation and input handling.
* Added additional test.
* - create op
- skip exec for empty inputs for non_max_suppression
- EmptyHandling idea
Signed-off-by: raver119 <raver119@gmail.com>
* Create op and mapping for it
Signed-off-by: raver119 <raver119@gmail.com>
* Added implementation files for image_resize and resize_bicubic ops.
* Image resize and image.resize_bicubic ops implementation. Initial revision.
* Minor fix
* Some TF imports disabled.
* Finished infrastructure development for the image.resize_bilinear and image_resize op implementations.
* Refactored resize methods.
* Added processing for the Mitchellcubic algorithm.
* adjust_contrast
* Small fix for TF import expected value loading when variable name starts with the test name
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Tests
* Tests added.
* Removed tf names absent in mapping.
* Some fixes.
* Small fixes
* Minor change
* Some failing tests.
* Disable failed test
* Ignore some tests
* Fix import class mapping
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Fix float property mapping (flatbuffers)
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Override equality function for model 'dropout'
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Failing tests
* Failed tests ignored temporarily.
* Minor fixes
* Small fix
* Conflict resolved
* Default implementations of tensorflowName and onnxName
* one range test
Signed-off-by: raver119 <raver119@gmail.com>
* few Context convenience signatures
Signed-off-by: raver119 <raver119@gmail.com>
* one more range test
Signed-off-by: raver119 <raver119@gmail.com>
* "range" "fix"
Signed-off-by: raver119 <raver119@gmail.com>
* adjust_contrast_v2 now allows the scale factor to be provided via an input variable
Signed-off-by: raver119 <raver119@gmail.com>
* adjust_contrast now allows the scale factor as a variable too
Signed-off-by: raver119 <raver119@gmail.com>
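For context, adjust_contrast scales each channel's deviation from its mean: out = (x - mean) * factor + mean. A plain-Java sketch of that formula for one channel (the scale factor is an ordinary argument here; in the op it can now also arrive as an input variable):

```java
// adjust_contrast semantics for one channel: move each value away from
// (or toward) the channel mean by the given contrast factor.
static double[] adjustContrast(double[] channel, double factor) {
    double mean = 0;
    for (double v : channel) mean += v;
    mean /= channel.length;
    double[] out = new double[channel.length];
    for (int i = 0; i < channel.length; i++) {
        out[i] = (channel[i] - mean) * factor + mean;
    }
    return out;
}
```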
* bitcast shape tests
Signed-off-by: raver119 <raver119@gmail.com>
* BitCast import dtype added
Signed-off-by: raver119 <raver119@gmail.com>
* few more BitCast signatures
Signed-off-by: raver119 <raver119@gmail.com>
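As background for the shape tests, bitcast reinterprets raw bytes, so the output shape follows from the ratio of element widths: casting to a narrower type appends a trailing dimension of inputWidth / outputWidth, while casting to a wider type requires the last dimension to equal that ratio and drops it. A small plain-Java sketch of that rule (illustrative, not the op's shape function):

```java
// Shape rule for bitcast, driven only by element byte widths.
static long[] bitcastShape(long[] inShape, int inBytes, int outBytes) {
    if (inBytes == outBytes) return inShape.clone();
    if (inBytes > outBytes) {                               // e.g. int64 -> int32: add a trailing dim
        long[] out = java.util.Arrays.copyOf(inShape, inShape.length + 1);
        out[inShape.length] = inBytes / outBytes;
        return out;
    }
    long ratio = outBytes / inBytes;                        // e.g. int8 -> int32: last dim must match
    if (inShape.length == 0 || inShape[inShape.length - 1] != ratio)
        throw new IllegalArgumentException("last dimension must be " + ratio);
    return java.util.Arrays.copyOf(inShape, inShape.length - 1);
}
```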
* - platform helpers can now be disabled on a per-op basis via Context::allowHelpers
- Java has access to it as well
Signed-off-by: raver119 <raver119@gmail.com>
* global platform-helpers trigger
Signed-off-by: raver119 <raver119@gmail.com>
* few signatures renamed
Signed-off-by: raver119 <raver119@gmail.com>
* - few new env variables to follow
- maxThreads/masterThreads differentiation
Signed-off-by: raver119 <raver119@gmail.com>
* Javadoc update
Signed-off-by: raver119 <raver119@gmail.com>
* #8280 biasadd_bp nchw arg fixes (java side) + test
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* #8285 Concat op Java side fixes
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Concat op cpp fix - allow dynamic axis to be negative, same as static axis
Signed-off-by: AlexDBlack <blacka101@gmail.com>
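The dynamic-axis fix boils down to normalizing a negative axis against the input rank, the same way the static path does. The whole rule in a couple of lines of plain Java:

```java
// Negative axes count from the end, so -1 on a rank-3 input becomes axis 2.
static int normalizeAxis(int axis, int rank) {
    return axis < 0 ? axis + rank : axis;
}
```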
* ignores for deconv3d import tests until deconv3d_tf op is implemented
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* - write code for new batchnorm backprop
Signed-off-by: Yurii <iuriish@yahoo.com>
* - testing batchnorm backprop
Signed-off-by: Yurii <iuriish@yahoo.com>
* - write code for batchnorm backprop based on mkl dnn api
Signed-off-by: Yurii <iuriish@yahoo.com>
* - testing and fixing bugs in batchnorm_bp mkl dnn
Signed-off-by: Yurii <iuriish@yahoo.com>
* - made corrections required by reviewer
Signed-off-by: Yurii <iuriish@yahoo.com>
* - change name in java wrapper for batchnorm op
Signed-off-by: Yurii <iuriish@yahoo.com>
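For reference, the batchnorm backprop being reworked here computes, per feature, the gradients with respect to gamma, beta and the input from the saved batch mean and variance. A compact single-feature plain-Java sketch of the standard formulas (not the mkldnn path):

```java
// Single-feature batchnorm backprop over a batch of size N:
//   xhat = (x - mean) / sqrt(var + eps)
//   dBeta = sum(dy), dGamma = sum(dy * xhat)
//   dx = gamma / (N * sqrt(var + eps)) * (N * dy - dBeta - xhat * dGamma)
static double[] batchnormBackpropDx(double[] x, double[] dy,
                                    double mean, double var, double gamma, double eps) {
    int n = x.length;
    double invStd = 1.0 / Math.sqrt(var + eps);
    double dBeta = 0, dGamma = 0;
    double[] xhat = new double[n];
    for (int i = 0; i < n; i++) {
        xhat[i] = (x[i] - mean) * invStd;
        dBeta += dy[i];
        dGamma += dy[i] * xhat[i];
    }
    double[] dx = new double[n];
    for (int i = 0; i < n; i++) {
        dx[i] = gamma * invStd / n * (n * dy[i] - dBeta - xhat[i] * dGamma);
    }
    return dx;
}
```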
* update javadocs and a few method signatures
Signed-off-by: Ryan Nett <rnett@skymind.io>
* add PRelu op
Signed-off-by: Ryan Nett <rnett@skymind.io>
* test and fixes
Signed-off-by: Ryan Nett <rnett@skymind.io>
* add PRelu op
Signed-off-by: Ryan Nett <rnett@skymind.io>
* test and fixes
Signed-off-by: Ryan Nett <rnett@skymind.io>
* slightly better test
Signed-off-by: Ryan Nett <rnett@skymind.io>
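PRelu is the parametric ReLU: identity for positive inputs, alpha * x otherwise, with alpha learned. The forward and backward rules in a few lines of plain Java (elementwise alpha, for simplicity):

```java
// PRelu forward: f(x) = x if x > 0, else alpha * x
static double prelu(double x, double alpha) {
    return x > 0 ? x : alpha * x;
}

// PRelu backward: df/dx = 1 if x > 0 else alpha;  df/dalpha = 0 if x > 0 else x
static double[] preluGrad(double x, double alpha, double dOut) {
    double dX = dOut * (x > 0 ? 1.0 : alpha);
    double dAlpha = dOut * (x > 0 ? 0.0 : x);
    return new double[]{dX, dAlpha};
}
```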
* Fixed signatures. SameDiff tests
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Tests fixed
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Test fixed
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Small fix
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Fixed test
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* fix execBackwards training issue
Signed-off-by: Ryan Nett <rnett@skymind.io>
* fix validation not specifying outputs
Signed-off-by: Ryan Nett <rnett@skymind.io>
* another fix for validation listeners and history
Signed-off-by: Ryan Nett <rnett@skymind.io>
* tests
Signed-off-by: Ryan Nett <rnett@skymind.io>
* add single batch dataset output methods
Signed-off-by: Ryan Nett <rnett@skymind.io>
* Small base spark test fix; ROC toString for empty ROC
Signed-off-by: Alex Black <blacka101@gmail.com>
* More fixes
Signed-off-by: Alex Black <blacka101@gmail.com>
* SDCNN cleanup
Signed-off-by: Ryan Nett <rnett@skymind.io>
* NonNull annotations
Signed-off-by: Ryan Nett <rnett@skymind.io>
* better javadoc, NonNull fix for sconv
Signed-off-by: Ryan Nett <rnett@skymind.io>
* update builders to fix names
Signed-off-by: Ryan Nett <rnett@skymind.io>
* fixes
Signed-off-by: Ryan Nett <rnett@skymind.io>
* even more fixes
Signed-off-by: Ryan Nett <rnett@skymind.io>
* fix for null bias
Signed-off-by: Ryan Nett <rnett@skymind.io>
* Rename flatbuffers DataType enum to DType
Signed-off-by: Alex Black <blacka101@gmail.com>
* Rename flatbuffers DataType enum to DType
Signed-off-by: Alex Black <blacka101@gmail.com>
* Updates for flatbuffers datatype enum renaming
Signed-off-by: Alex Black <blacka101@gmail.com>
* new (for java at least) backprop ops
Signed-off-by: Ryan Nett <rnett@skymind.io>
* update activation functions
Signed-off-by: Ryan Nett <rnett@skymind.io>
* add differential functions for SameDiff
Signed-off-by: Ryan Nett <rnett@skymind.io>
* deprecate old ops
Signed-off-by: Ryan Nett <rnett@skymind.io>
* update correct old ops
Signed-off-by: Ryan Nett <rnett@skymind.io>
* update ops backprop to use new ops
Signed-off-by: Ryan Nett <rnett@skymind.io>
* misc updates for deprecated functions (mostly Nd4j.rand w/ vararg shape)
Signed-off-by: Ryan Nett <rnett@skymind.io>
* remove old imports
Signed-off-by: Ryan Nett <rnett@skymind.io>
* Add Java op class for ReLU derivative, and use it in ActivationReLU
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Fix
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* - add one additional test for svd
* - provide float argument in eye op to specify the data type of the output array
Signed-off-by: Yurii <yurii@skymind.io>
* - add cuda capability check to mmulHelper
Signed-off-by: Yurii <yurii@skymind.io>
* - use another method for device id evaluation
Signed-off-by: Yurii <yurii@skymind.io>
* Eye data type as T argument
Signed-off-by: raver119 <raver119@gmail.com>
* remove some unneeded java-side output shape calculations
Signed-off-by: Ryan Nett <rnett@skymind.io>
* delete Broadcast
Signed-off-by: Ryan Nett <rnett@skymind.io>
* delete Linear and Module
Signed-off-by: Ryan Nett <rnett@skymind.io>
* update Identity, HashCode, and NoOp
Signed-off-by: Ryan Nett <rnett@skymind.io>
* removed Cast java-side shape function, added tests and SDVariable.isEmpty
Signed-off-by: Ryan Nett <rnett@skymind.io>
* ignoring test w/ issues on master
Signed-off-by: Ryan Nett <rnett@skymind.io>
* noop needs more work, fixed BaseArithmeticBackprop and BaseDynamicTransform ops
merge in master for c++ build fix
Signed-off-by: Ryan Nett <rnett@skymind.io>
* fix EqualTo
Signed-off-by: Ryan Nett <rnett@skymind.io>
* fix other cond ops
Signed-off-by: Ryan Nett <rnett@skymind.io>
* "fake" ops calculateOutputShape() throws exception
Signed-off-by: Ryan Nett <rnett@skymind.io>
* use c++ shape calc for Linspace
Signed-off-by: Ryan Nett <rnett@skymind.io>
* fix exception message, move most to BaseCompatOp
Signed-off-by: Ryan Nett <rnett@skymind.io>
* remove SDVariable.isEmpty
Signed-off-by: Ryan Nett <rnett@skymind.io>
* remove commented out code
Signed-off-by: Ryan Nett <rnett@skymind.io>
* remove unneeded resolveProperties methods
Signed-off-by: Ryan Nett <rnett@skymind.io>
* final fixes, make final to prevent more from being added
Signed-off-by: Ryan Nett <rnett@skymind.io>
* gather fix
Signed-off-by: Ryan Nett <rnett@skymind.io>
* deprecate DifferentialFunction resolveProps
Signed-off-by: Ryan Nett <rnett@skymind.io>
* small fix of compiler warnings in nd4j.
Signed-off-by: Robert Altena <Rob@Ra-ai.com>
* INDArray javadoc start.
Signed-off-by: Robert Altena <Rob@Ra-ai.com>
* First steps for protobuf version upgrade
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Phase 2
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Update imports to shaded protobuf
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Version fix
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Switch to single execution for protobuf codegen to work around plugin bug
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Automatically delete old PB generated files after name change
Signed-off-by: Alex Black <blacka101@gmail.com>
* Nd4j pad update
Signed-off-by: Ryan Nett <rnett@skymind.io>
* switched from guava Immutables to Collections.unmodifiableList/Map
Signed-off-by: Ryan Nett <rnett@skymind.io>
* javadoc
Signed-off-by: Ryan Nett <rnett@skymind.io>
* use new pad
Signed-off-by: Ryan Nett <rnett@skymind.io>
* conv tests use OpValidation
Signed-off-by: Ryan Nett <rnett@skymind.io>
* deconv3d overrides
Signed-off-by: Ryan Nett <rnett@skymind.io>
* test fix for the new pad method
Signed-off-by: Ryan Nett <rnett@skymind.io>
* more test fixes
Signed-off-by: Ryan Nett <rnett@skymind.io>
* more test fixes
Signed-off-by: Ryan Nett <rnett@skymind.io>
* rename SameDiff function methods to op (except for the actual SameDiff function ones)
Signed-off-by: Ryan Nett <rnett@skymind.io>
* more pad overloads, test fix
Signed-off-by: Ryan Nett <rnett@skymind.io>
* test updates
Signed-off-by: Ryan Nett <rnett@skymind.io>
* conv1d test
Signed-off-by: Ryan Nett <rnett@skymind.io>
* remove Conv1D tf import (there isn't a TF conv1d op)
Signed-off-by: Ryan Nett <rnett@skymind.io>
* remove numThreads from Nd4j
Signed-off-by: Ryan Nett <rnett@skymind.io>
* replace Old ops with their newer versions, deprecate ones that haven't already been deprecated
Signed-off-by: Ryan Nett <rnett@skymind.io>
* remove use of setNumThreads
Signed-off-by: Ryan Nett <rnett@skymind.io>
* fix for Reverse and ATan2
Signed-off-by: Ryan Nett <rnett@skymind.io>
* fix test for wrong equals type
Signed-off-by: Ryan Nett <rnett@skymind.io>
* well it works now
Signed-off-by: Ryan Nett <rnett@skymind.io>
* better javadocs
Signed-off-by: Ryan Nett <rnett@skymind.io>
* NonNulls
Signed-off-by: Ryan Nett <rnett@skymind.io>
* better array literal
Signed-off-by: Ryan Nett <rnett@skymind.io>
* re-add tf import stuff (will remove later)
Signed-off-by: Ryan Nett <rnett@skymind.io>
* conv1d config load fix
Signed-off-by: Ryan Nett <rnett@skymind.io>
* partial config usage changes
Signed-off-by: Ryan Nett <rnett@skymind.io>
* remove Old op classes
Signed-off-by: Ryan Nett <rnett@skymind.io>
* config property fixes
Signed-off-by: Ryan Nett <rnett@skymind.io>
* removed one too many ops
Signed-off-by: Ryan Nett <rnett@skymind.io>