* Corrected input checking and tests for bitcast op.
* Fixed an issue with non_max_suppression shape generation and processing when a score threshold is given.
* Fixed bilinear resize kernel and tests.
* push for Serhii
Signed-off-by: raver119 <raver119@gmail.com>
* Added test for nearest_neighbor resize with int input.
* Added data type check for input/output match.
* Eliminate error in macros.
* Improved output message for type checking.
* Fixed input/output types for op.
* Eliminated wasteful logging.
* Refactored resize_bilinear helper for multithreading for cpu platform.
* Cosmetic changes only.
* Fixed error for string substitution.
* Skip test for cbow_batch with cuda.
* fix for resizeNearestNeighbor output dtype
Signed-off-by: raver119 <raver119@gmail.com>
* Refactored non_max_suppression helper.
* Refactored shape generation and input handling.
* Added additional test.
* - create op
- skip exec for empty inputs for non_max_suppression
- EmptyHandling idea
Signed-off-by: raver119 <raver119@gmail.com>
* Create op and mapping for it
Signed-off-by: raver119 <raver119@gmail.com>
* - get rid of some copy procedures in mmulHelper ops
Signed-off-by: Yurii <iuriish@yahoo.com>
* - further work on embedding cuda api for batched gemm (cublasGemmBatchedEx) in our mmulHelper class
Signed-off-by: Yurii <iuriish@yahoo.com>
* - further work on cuda batched gemm api
Signed-off-by: Yurii <iuriish@yahoo.com>
* - write own cuda kernel performing batched gemm
Signed-off-by: Yurii <iuriish@yahoo.com>
* missing include in MmulHelper
Signed-off-by: raver119 <raver119@gmail.com>
* - keep previous correct kernels for mmulNxN in code (forgot to earlier), since the new one may fail for some reason in the future
Signed-off-by: Yurii <iuriish@yahoo.com>
* disable old tensordot
Signed-off-by: raver119 <raver119@gmail.com>
* - rewrite cuda kernels for usualGemm and usualGemv
Signed-off-by: Yurii <iuriish@yahoo.com>
* - profiling mmul helpers
Signed-off-by: Yurii <iuriish@yahoo.com>
* - added prints to check shapes
Signed-off-by: Yurii <iuriish@yahoo.com>
* - correct type of output array C in mmulNxN
Signed-off-by: Yurii <iuriish@yahoo.com>
* - take into account possible nans in C array
Signed-off-by: Yurii <iuriish@yahoo.com>
* slightly change numThreads message
Signed-off-by: raver119 <raver119@gmail.com>
* - make corrections in accordance with notes given in PR review
Signed-off-by: Yurii <iuriish@yahoo.com>
* Added implementation files for image_resize and resize_bicubic ops.
* Image resize and image.resize_bicubic ops implementation. Initial revision.
* Minor fix
* Some TF imports disabled.
* Finished infrastructure development for image.resize_bilinear op and image_resize op implementation.
* Refactored resize methods.
* Added processing for Mitchellcubic algorithm.
* adjust_contrast
* Small fix for TF import expected value loading when variable name starts with the test name
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Tests
* Tests added.
* Removed TF names absent from mapping.
* Some fixes.
* Small fixes
* Minor change
* Some failing tests.
* Disable failed test
* Ignore some tests
* Fix import class mapping
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Fix float property mapping (flatbuffers)
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Override equality function for model 'dropout'
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Fail tests
* Failed tests ignored temporarily.
* Minor fixes
* Small fix
* Conflict resolved
* Default implementations of tensorflowName and onnxName
* one range test
Signed-off-by: raver119 <raver119@gmail.com>
* few Context convenience signatures
Signed-off-by: raver119 <raver119@gmail.com>
* one more range test
Signed-off-by: raver119 <raver119@gmail.com>
* "range" "fix"
Signed-off-by: raver119 <raver119@gmail.com>
* adjust_contrast_v2 now allows scale factor to be provided via input_variable
Signed-off-by: raver119 <raver119@gmail.com>
* adjust_contrast now allows scale factor as variable too
Signed-off-by: raver119 <raver119@gmail.com>
* bitcast shape tests
Signed-off-by: raver119 <raver119@gmail.com>
* BitCast import dtype added
Signed-off-by: raver119 <raver119@gmail.com>
* few more BitCast signatures
Signed-off-by: raver119 <raver119@gmail.com>
* - platform helpers can now be disabled on a per-op basis via Context::allowHelpers
- java has access to it as well
Signed-off-by: raver119 <raver119@gmail.com>
* global platform-helpers trigger
Signed-off-by: raver119 <raver119@gmail.com>
* few signatures renamed
Signed-off-by: raver119 <raver119@gmail.com>
* - few new env variables to follow
- maxThreads/masterThreads differentiation
Signed-off-by: raver119 <raver119@gmail.com>
* Javadoc update
Signed-off-by: raver119 <raver119@gmail.com>
* #8280 biasadd_bp nchw arg fixes (java side) + test
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* #8285 Concat op Java side fixes
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Concat op cpp fix - allow dynamic axis to be negative, same as static axis
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* ignores for deconv3d import tests until deconv3d_tf op is implemented
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* - write code for new batchnorm backprop
Signed-off-by: Yurii <iuriish@yahoo.com>
* - testing batchnorm backprop
Signed-off-by: Yurii <iuriish@yahoo.com>
* - write code for batchnorm backprop based on mkl dnn api
Signed-off-by: Yurii <iuriish@yahoo.com>
* - testing and fixing bugs in batchnorm_bp mkl dnn
Signed-off-by: Yurii <iuriish@yahoo.com>
* - made corrections required by reviewer
Signed-off-by: Yurii <iuriish@yahoo.com>
* - change name in java wrapper for batchnorm op
Signed-off-by: Yurii <iuriish@yahoo.com>
* update javadocs and a few method signatures
Signed-off-by: Ryan Nett <rnett@skymind.io>
* add PRelu op
Signed-off-by: Ryan Nett <rnett@skymind.io>
* test and fixes
Signed-off-by: Ryan Nett <rnett@skymind.io>
* add PRelu op
Signed-off-by: Ryan Nett <rnett@skymind.io>
* test and fixes
Signed-off-by: Ryan Nett <rnett@skymind.io>
* slightly better test
Signed-off-by: Ryan Nett <rnett@skymind.io>
* Fixed signatures. SameDiff tests
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Tests fixed
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Test fixed
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Small fix
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Fixed test
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* fix execBackwards training issue
Signed-off-by: Ryan Nett <rnett@skymind.io>
* fix validation not specifying outputs
Signed-off-by: Ryan Nett <rnett@skymind.io>
* another fix for validation listeners and history
Signed-off-by: Ryan Nett <rnett@skymind.io>
* tests
Signed-off-by: Ryan Nett <rnett@skymind.io>
* add single batch dataset output methods
Signed-off-by: Ryan Nett <rnett@skymind.io>
* Small base spark test fix; ROC toString for empty ROC
Signed-off-by: Alex Black <blacka101@gmail.com>
* More fixes
Signed-off-by: Alex Black <blacka101@gmail.com>
* SDCNN cleanup
Signed-off-by: Ryan Nett <rnett@skymind.io>
* NonNull annotations
Signed-off-by: Ryan Nett <rnett@skymind.io>
* better javadoc, NonNull fix for sconv
Signed-off-by: Ryan Nett <rnett@skymind.io>
* update builders to fix names
Signed-off-by: Ryan Nett <rnett@skymind.io>
* fixes
Signed-off-by: Ryan Nett <rnett@skymind.io>
* even more fixes
Signed-off-by: Ryan Nett <rnett@skymind.io>
* fix for null bias
Signed-off-by: Ryan Nett <rnett@skymind.io>
* Rename flatbuffers DataType enum to DType
Signed-off-by: Alex Black <blacka101@gmail.com>
* Rename flatbuffers DataType enum to DType
Signed-off-by: Alex Black <blacka101@gmail.com>
* Updates for flatbuffers datatype enum renaming
Signed-off-by: Alex Black <blacka101@gmail.com>
* new (for java at least) backprop ops
Signed-off-by: Ryan Nett <rnett@skymind.io>
* update activation functions
Signed-off-by: Ryan Nett <rnett@skymind.io>
* add differential functions for SameDiff
Signed-off-by: Ryan Nett <rnett@skymind.io>
* deprecate old ops
Signed-off-by: Ryan Nett <rnett@skymind.io>
* update correct old ops
Signed-off-by: Ryan Nett <rnett@skymind.io>
* update ops backprop to use new ops
Signed-off-by: Ryan Nett <rnett@skymind.io>
* misc updates for deprecated functions (mostly Nd4j.rand w/ vararg shape)
Signed-off-by: Ryan Nett <rnett@skymind.io>
* remove old imports
Signed-off-by: Ryan Nett <rnett@skymind.io>
* Add java op class for relu derivative, and use in ActivationReLU
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Fix
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* - add one additional test for svd
* - provide float argument in eye op to be the type of the output array
Signed-off-by: Yurii <yurii@skymind.io>
* - add cuda capability check to mmulHelper
Signed-off-by: Yurii <yurii@skymind.io>
* - make use of another method for device id evaluation
Signed-off-by: Yurii <yurii@skymind.io>
* Eye data type as T argument
Signed-off-by: raver119 <raver119@gmail.com>
* remove some unneeded java-side output shape calculations
Signed-off-by: Ryan Nett <rnett@skymind.io>
* delete Broadcast
Signed-off-by: Ryan Nett <rnett@skymind.io>
* delete Linear and Module,
Signed-off-by: Ryan Nett <rnett@skymind.io>
* update Identity, HashCode, and NoOp
Signed-off-by: Ryan Nett <rnett@skymind.io>
* removed Cast java-side shape function, added tests and SDVariable.isEmpty
Signed-off-by: Ryan Nett <rnett@skymind.io>
* ignoring test w/ issues on master
Signed-off-by: Ryan Nett <rnett@skymind.io>
* noop needs more work, fixed BaseArithmeticBackprop and BaseDynamicTransform ops
merge in master for c++ build fix
Signed-off-by: Ryan Nett <rnett@skymind.io>
* fix EqualTo
Signed-off-by: Ryan Nett <rnett@skymind.io>
* fix other cond ops
Signed-off-by: Ryan Nett <rnett@skymind.io>
* "fake" ops calculateOutputShape() throws exception
Signed-off-by: Ryan Nett <rnett@skymind.io>
* use c++ shape calc for Linspace
Signed-off-by: Ryan Nett <rnett@skymind.io>
* fix exception message, move most to BaseCompatOp
Signed-off-by: Ryan Nett <rnett@skymind.io>
* remove SDVariable.isEmpty
Signed-off-by: Ryan Nett <rnett@skymind.io>
* remove commented out code
Signed-off-by: Ryan Nett <rnett@skymind.io>
* remove unneeded resolveProperties methods
Signed-off-by: Ryan Nett <rnett@skymind.io>
* final fixes, make final to prevent more from being added
Signed-off-by: Ryan Nett <rnett@skymind.io>
* gather fix
Signed-off-by: Ryan Nett <rnett@skymind.io>
* deprecate DifferentialFunction resolveProps
Signed-off-by: Ryan Nett <rnett@skymind.io>
* small fix of compiler warnings in nd4j.
Signed-off-by: Robert Altena <Rob@Ra-ai.com>
* INDArray javadoc start.
Signed-off-by: Robert Altena <Rob@Ra-ai.com>
* First steps for protobuf version upgrade
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Phase 2
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Update imports to shaded protobuf
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Version fix
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Switch to single execution for protobuf codegen to work around plugin bug
Signed-off-by: AlexDBlack <blacka101@gmail.com>
* Automatically delete old PB generated files after name change
Signed-off-by: Alex Black <blacka101@gmail.com>
* Nd4j pad update
Signed-off-by: Ryan Nett <rnett@skymind.io>
* switched from guava Immutables to Collections.unmodifiableList/Map
Signed-off-by: Ryan Nett <rnett@skymind.io>
* javadoc
Signed-off-by: Ryan Nett <rnett@skymind.io>
* use new pad
Signed-off-by: Ryan Nett <rnett@skymind.io>
* conv tests use OpValidation
Signed-off-by: Ryan Nett <rnett@skymind.io>
* deconv3d overrides
Signed-off-by: Ryan Nett <rnett@skymind.io>
* test fix for the new pad method
Signed-off-by: Ryan Nett <rnett@skymind.io>
* more test fixes
Signed-off-by: Ryan Nett <rnett@skymind.io>
* more test fixes
Signed-off-by: Ryan Nett <rnett@skymind.io>
* rename SameDiff function methods to op (except for the actual SameDiff function ones)
Signed-off-by: Ryan Nett <rnett@skymind.io>
* more pad overloads, test fix
Signed-off-by: Ryan Nett <rnett@skymind.io>
* test updates
Signed-off-by: Ryan Nett <rnett@skymind.io>
* conv1d test
Signed-off-by: Ryan Nett <rnett@skymind.io>
* remove Conv1D tf import (there isn't a TF conv1d op)
Signed-off-by: Ryan Nett <rnett@skymind.io>
* remove numThreads from Nd4j
Signed-off-by: Ryan Nett <rnett@skymind.io>
* replace Old ops with their newer versions, deprecate ones that haven't already been deprecated
Signed-off-by: Ryan Nett <rnett@skymind.io>
* remove use of setNumThreads
Signed-off-by: Ryan Nett <rnett@skymind.io>
* fix for Reverse and ATan2
Signed-off-by: Ryan Nett <rnett@skymind.io>
* fix test for wrong equals type
Signed-off-by: Ryan Nett <rnett@skymind.io>
* well it works now
Signed-off-by: Ryan Nett <rnett@skymind.io>
* better javadocs
Signed-off-by: Ryan Nett <rnett@skymind.io>
* NonNulls
Signed-off-by: Ryan Nett <rnett@skymind.io>
* better array literal
Signed-off-by: Ryan Nett <rnett@skymind.io>
* re-add tf import stuff (will remove later)
Signed-off-by: Ryan Nett <rnett@skymind.io>
* conv1d config load fix
Signed-off-by: Ryan Nett <rnett@skymind.io>
* partial config usage changes
Signed-off-by: Ryan Nett <rnett@skymind.io>
* remove Old op classes
Signed-off-by: Ryan Nett <rnett@skymind.io>
* config property fixes
Signed-off-by: Ryan Nett <rnett@skymind.io>
* removed one too many ops
Signed-off-by: Ryan Nett <rnett@skymind.io>
* Jar packaging for maven
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Typo fixed
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* minimal viable prototype for SD
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Tests corrected
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* few fixes for bfloat16 in java and cpp (#114)
Signed-off-by: raver119 <raver119@gmail.com>
* Nd4j refactoring (#112)
* refactoring
Signed-off-by: Robert Altena <Rob@Ra-ai.com>
* wip
Signed-off-by: Robert Altena <Rob@Ra-ai.com>
* wip
Signed-off-by: Robert Altena <Rob@Ra-ai.com>
* wip
* fix: make test public.
Signed-off-by: Robert Altena <Rob@Ra-ai.com>
* make test public.
Signed-off-by: Robert Altena <Rob@Ra-ai.com>
* fixes read refactoring.
Signed-off-by: Robert Altena <Rob@Ra-ai.com>
* Enabled test
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Test copied from nd4j
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* [WIP] bitwise ops (#115)
* - cyclic_shift_bits + test
- shift_bits + test
Signed-off-by: raver119 <raver119@gmail.com>
* OMP_IF replacement
Signed-off-by: raver119 <raver119@gmail.com>
* Thin wrapper added
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Cleanup
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Shugeo cuda tests (#116)
* Added tests for get_seed/set_seed ops.
* Added missed tests for scatter_sub/mul/div ops.
* Added tests for hardsigmoid and hardsigmoid_bp.
* Added tests for hardtanh and hardtanh_bp ops.
* Added test for histogram op.
* Added tests for identity op.
* Refactored mergemaxindex op. Added tests for log1p, mergemaxindex, mod and mod_bp ops.
* Fixed tests for FloorDiv.
* Added test for rank op.
* Added tests for rationaltanh/rationaltanh_bp ops.
* Added tests for realdiv/realdiv_bp.
* Added tests for rectifiedtanh/_bp ops.
* Added tests for shapes_of op.
* Added tests for shapes_of op.
* Added tests for size op.
* Added tests for softplus/_bp ops.
* Added tests for softsign/_bp ops.
* Added tests for toggle_bits op. Fixed processing of OP_IMPL and similar definitions.
* Added test for truncatediv op.
* Added another test for truncatediv op.
* Added another test for histogram.
* Added tests for unstack_list op.
* Refactored to_int32/uint32/float16/float32/double/int64/uint64 ops and tests.
* Refactored mergemaxindex op helper for cuda platform and tests.
* Fixed cuda kernel for histogram op helper.
* Refactored skipgram to avoid early buffer shift.
* Fixed check in non_max_suppression op cuda helper. Added cuda kernel implementation for skipgram op helpers.
* Added implementation of skipgram op helper for cuda platform. Working revision.
* Fixed mergeMaxIndex kernel and moved it to a separate source file.
* Adding arithmetic
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Eliminated memory leaks and dropped wasteful prints in tests. (#117)
* Added tests
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* fix test
Signed-off-by: raver119 <raver119@gmail.com>
* no openmp for ClipByGlobalNorm
Signed-off-by: raver119 <raver119@gmail.com>
* Stubs for ops
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* [WIP] right shift ops (#118)
* right shift ops
Signed-off-by: raver119 <raver119@gmail.com>
* typo
Signed-off-by: raver119 <raver119@gmail.com>
* rotr test
Signed-off-by: raver119 <raver119@gmail.com>
* fix: IOException no longer thrown by read(). (#120)
Signed-off-by: Robert Altena <Rob@Ra-ai.com>
* Small fix in TensorflowConversion class (#121)
Signed-off-by: Alex Black <blacka101@gmail.com>
* Shyrma concat2 (#119)
* - rewrite/improve concat
Signed-off-by: Yurii <yurii@skymind.io>
* - got rid of unnecessary argument in concat kernel
Signed-off-by: Yurii <yurii@skymind.io>
* InferenceSession additional validation for shape calc (#122)
Signed-off-by: Alex Black <blacka101@gmail.com>
* [WIP] build fix (#124)
* AffinityManager changes
Signed-off-by: raver119 <raver119@gmail.com>
* build fixes
Signed-off-by: raver119 <raver119@gmail.com>
* OP/CONFIGURABLE_OP shapefn fix (#125)
Signed-off-by: raver119 <raver119@gmail.com>
* Some ops added
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Nd4j refactoring (last one!) (#123)
* fix: IOException no longer thrown by read().
Signed-off-by: Robert Altena <Rob@Ra-ai.com>
* refactoring
* last refactorings
Signed-off-by: Robert Altena <Rob@Ra-ai.com>
* Advanced tests
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* [WIP] Java wrappers (#126)
* shift/rshift/rotl/rotr java/sd wrappers
Signed-off-by: raver119 <raver119@gmail.com>
* few additional wrappers
Signed-off-by: raver119 <raver119@gmail.com>
* minor naming tweak
Signed-off-by: raver119 <raver119@gmail.com>
* Test added
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* one more build fix
Signed-off-by: raver119 <raver119@gmail.com>
* Jar packaging for maven
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Typo fixed
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* minimal viable prototype for SD
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Tests corrected
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Enabled test
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Test copied from nd4j
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Thin wrapper added
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Cleanup
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Adding arithmetic
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Added tests
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Stubs for ops
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Some ops added
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Advanced tests
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Test added
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Ops added
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Small build fixes (#127)
* Small build fixes
Signed-off-by: Alex Black <blacka101@gmail.com>
* Fix RL4J
Signed-off-by: Alex Black <blacka101@gmail.com>
* Test fixes
Signed-off-by: Alex Black <blacka101@gmail.com>
* Another fix
Signed-off-by: Alex Black <blacka101@gmail.com>
* parent module name fix
Signed-off-by: raver119 <raver119@gmail.com>
* [WIP] Roll rewritten (#128)
* Process correct input vector.
* Added tests for roll.
* Refactored roll to conform with TF. Eliminated memory leaks in Roll op tests.
* no thread_local for cpu
Signed-off-by: raver119 <raver119@gmail.com>
* Jar packaging for maven
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Typo fixed
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* minimal viable prototype for SD
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Tests corrected
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Enabled test
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Test copied from nd4j
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Thin wrapper added
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Cleanup
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Adding arithmetic
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Added tests
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Stubs for ops
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Some ops added
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Advanced tests
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Test added
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Ops added
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Tests added
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Boolean logic ops
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Test added
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>
* Shift operations
Signed-off-by: Alexander Stoyakin <alexander.stoyakin@gmail.com>